<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Datadog on Sam's Space</title><link>https://blog.overandunder.au/tags/datadog/</link><description>Recent content in Datadog on Sam's Space</description><generator>Hugo</generator><language>en</language><lastBuildDate>Fri, 15 May 2026 17:33:57 +1000</lastBuildDate><atom:link href="https://blog.overandunder.au/tags/datadog/index.xml" rel="self" type="application/rss+xml"/><item><title>Learning About Datadog MCP</title><link>https://blog.overandunder.au/posts/learning-about-datadog-mcp/</link><pubDate>Fri, 15 May 2026 17:33:57 +1000</pubDate><guid>https://blog.overandunder.au/posts/learning-about-datadog-mcp/</guid><description>&lt;p&gt;With a bit of time on my hands recently between projects, I thought I&amp;rsquo;d take the opportunity to dig into some of the newer innovations in the technology space and get across genAI tools: how they work, and how they might be used beyond a simple web chat interface. One area we&amp;rsquo;d flagged as potentially interesting was the (still in beta) MCP server from Datadog. I had a vague idea of building a system that could be hosted in a customer&amp;rsquo;s own environment with access to their Datadog account, and of exploring how such systems might be delivered as a service offering to clients, so I started looking into options. For the end-to-end offering I decided to work with the Datadog MCP server, LangChain to control the model, and AWS Bedrock for serving the LLM itself.&lt;/p&gt;</description></item></channel></rss>