
Datadog platform expands to support troubleshooting of generative AI applications

Datadog (DDOG) announced new capabilities that help customers monitor and troubleshoot issues in their generative AI-based applications. The company unveiled a broad set of generative AI observability capabilities to help teams deploy LLM-based applications to production with confidence and troubleshoot health, cost and accuracy in real time. These capabilities include integrations across the end-to-end AI stack: AI infrastructure and compute, including NVIDIA (NVDA), CoreWeave, AWS (AMZN), Azure (MSFT) and Google Cloud (GOOGL); embeddings and data management, including Weaviate, Pinecone and Airbyte; model serving and deployment, including TorchServe, Vertex AI and Amazon SageMaker; the model layer, including OpenAI and Azure OpenAI; and the orchestration framework LangChain.

Additionally, Datadog released a beta of a complete LLM observability solution, which brings together data from applications, models and various integrations to help engineers quickly detect and resolve real-world application problems, such as model cost spikes, performance degradations, drift and hallucinations, and ensure positive end-user experiences. Datadog's AI/LLM integrations are now generally available.

