New Relic has released New Relic Grok, the first generative AI assistant for observability. The assistant is designed to simplify the observability experience for all engineers, regardless of previous experience. Using New Relic Grok, which combines OpenAI’s large language models with New Relic’s integrated telemetry data platform, engineers can gain insight from telemetry data sources through natural-language prompts. They can use it to perform tasks such as setting up instrumentation, debugging problems, creating reports, and managing accounts with plain-language requests in over 50 languages. The goal of the AI assistant is to democratize observability, reduce the need to manually sift through data, and reveal insights from fragmented telemetry data sources. Read on to learn about New Relic’s launch of the world’s first generative AI assistant for observability.
Engineers Need Observability
Staples and his team remind us that software engineers rely on observability to run modern digital business services. They must gain real-time insight into operations, system health, and customer experience. But engineers often face piles of siloed, irrelevant telemetry data and a query-based troubleshooting interface that is difficult to use. In many cases, the problem boils down to engineers lacking the experience to ask the right questions. What is needed, therefore, is an observability platform with a degree of automation and the awareness to navigate itself.
That is a better experience than working a stick shift. New Relic Grok provides an integrated telemetry data source to ensure high-quality generative AI responses. The team behind the technology says that, as the value of generative AI is recognized, it will drive “tool and data integration” into New Relic. In other words, the company argues that its services will attract users because they deliver on that purpose.
Unleashing Observability for All
Observability is now as effortless as asking New Relic Grok, “What’s wrong with my browser app?” Combined with OpenAI’s large language models and New Relic’s integrated telemetry data platform, a simple question can provide deep insight into the state of a system. Within the New Relic platform, ask New Relic Grok a question using the familiar chat interface and receive a detailed analysis, root-cause insights, and recommended fixes. All engineers, from developers to operations, security, product, support, and QA, become observability pros, fixing issues faster, reducing outages, and increasing development speed and innovation.
Grok Mountains of Data
If you’ve ever built or maintained software, you know how difficult it can be to distinguish signal from noise. Navigating dashboards, documents, runbooks, alerts, anomalies, logs, traces, and the rest, it can be challenging to pull large amounts of data together into a clear picture. And that assumes you were monitoring the right things in the first place. Unsurprisingly, observability is often limited to a select few skilled users. But imagine having an intelligent assistant help you scrutinize data to find root causes, fix code-level errors, instrument parts of your stack, or create and execute queries.
GenAI Requires Integrated Data
Like other solutions driven by generative AI (GenAI), New Relic Grok becomes better and more powerful the more data it can access. And when that data is under one roof, it can surface noteworthy insights more efficiently and learn faster. New Relic’s all-in-one observability platform is that roof, bringing data, context, tools, and teams together in one integrated experience. Combining LLMs with this extensive integrated telemetry data platform improves AI response quality and accelerates AI learning. In addition, New Relic Grok lets you approach problems from myriad angles thanks to a single integrated database that generates insights from more than 30 correlated functions.
Who can use New Relic Grok AI?
New Relic Grok makes observability accessible to all engineers. Because it relies on plain-language input, users do not need to know how to write queries to retrieve data.
New Relic Grok can do the following:
Developers and QA: Identify errors down to the line of code in the IDE, get clarification, and review suggested fixes.
DevOps engineers: Get reports on the impact of recent changes and monitor how key signals are trending.
SecOps: Instantly surface new security vulnerabilities and correlate them with recent changes.
Product: View high-level system status reports and identify opportunities to improve the end-user experience.
Support: Quickly gather insight into recent anomalies and issues.
New Relic administrators: Get help managing users, data retention rules, ingestion monitoring, and more.
Executives: Stay on top of every launch.
New Relic Grok AI enables all engineers to:
Set up instrumentation and monitoring:
- Identify instrumentation gaps (one way to check for these by hand is sketched after this list).
- Provide guidance on instrumenting services.
- Set up missing alerts.
- Automate alerts using Terraform.
Isolate root causes: Use chat to ask questions like “Why is the service not working?” New Relic Grok analyzes telemetry data and recent changes to determine the root cause.
Debug code-level issues: Using CodeStream and Errors Inbox, New Relic Grok automatically identifies code-level errors in the IDE, then analyzes code, stack traces, and operational telemetry to suggest fixes.
Generate reports and dashboards: In just a few words, anyone can generate system or app health reports that include anomalies, issues, and recent changes, with no more digging through dashboards.
Query in natural language:
- Create analysis queries in plain language in 50+ languages.
- Translate query results into simple descriptions.
- Easily share them with all teams, including executives.
Manage administrative tasks: Let New Relic Grok manage accounts, users and user access, data retention rules, usage, billing, and more.
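Coming back to the instrumentation-gap item above: New Relic Grok surfaces these gaps from a chat prompt, but it helps to see what the manual check it replaces might look like. The sketch below is a minimal illustration, not New Relic’s implementation. It assumes New Relic’s NerdGraph GraphQL endpoint, a user API key in the NEW_RELIC_API_KEY environment variable, a placeholder account ID, and a hypothetical list of expected service names.

```python
"""Minimal sketch: spot services that have stopped reporting telemetry.

Assumptions (not from the article): the NerdGraph GraphQL endpoint, a user
API key in NEW_RELIC_API_KEY, a placeholder account ID, and a hypothetical
list of expected service names.
"""
import os

import requests

NERDGRAPH_URL = "https://api.newrelic.com/graphql"
ACCOUNT_ID = 1234567  # placeholder account ID
API_KEY = os.environ["NEW_RELIC_API_KEY"]

# Hypothetical services you expect to be instrumented.
EXPECTED_SERVICES = {"checkout", "cart", "browser-app"}

# NRQL: which application names actually reported transactions in the last day?
NRQL = "SELECT uniques(appName) FROM Transaction SINCE 1 day ago"

GRAPHQL = """
query($accountId: Int!, $nrql: Nrql!) {
  actor { account(id: $accountId) { nrql(query: $nrql) { results } } }
}
"""

response = requests.post(
    NERDGRAPH_URL,
    headers={"API-Key": API_KEY},
    json={"query": GRAPHQL, "variables": {"accountId": ACCOUNT_ID, "nrql": NRQL}},
    timeout=30,
)
response.raise_for_status()
results = response.json()["data"]["actor"]["account"]["nrql"]["results"]

# The exact result key is an assumption about the NRQL response shape.
reporting = set(results[0].get("uniques.appName", [])) if results else set()

# Anything expected but not reporting is a likely instrumentation gap.
for service in sorted(EXPECTED_SERVICES - reporting):
    print(f"Possible instrumentation gap: {service} reported no transactions in the last day")
```

The point of New Relic Grok is that a prompt such as “Which of my services aren’t reporting data?” takes the place of writing and maintaining a script like this.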
What questions can I ask New Relic Grok AI?
Root cause? Ask New Relic Grok AI: Ask whatever is bothering you, such as “Why isn’t my cart working?” New Relic Grok analyzes telemetry data and your software stack to provide the right insights, root causes, and solutions.
Create queries using plain language: Using OpenAI’s large language models (LLMs), New Relic Grok transforms human conversation into queries and turns query results into simple descriptions, as sketched below. Everyone, including executives, can share the same understanding in more than 50 languages.
Retrieve AI-suggested code fixes: New Relic Grok automatically identifies code-level errors in the IDE and examines telemetry, code snippets, and stack traces to suggest code fixes.
Instrument, report, and manage like a pro: Quickly build mature observability practices. New Relic Grok helps you automatically set up instrumentation and alerts, generate system health reports, and manage accounts and users.
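To make the plain-language query item above concrete, here is a hand-written sketch of the two translation steps involved. The prompt, the NRQL query it resolves to, and the sample result are illustrative assumptions, not output captured from New Relic Grok.

```python
"""Minimal sketch of the two translation steps New Relic Grok performs.

The prompt, the NRQL it resolves to, and the sample result below are
illustrative assumptions, not output captured from New Relic Grok.
"""

# 1. A plain-language prompt an engineer might type into the chat interface.
PROMPT = "How many errors did the checkout service have in the last hour?"

# 2. One NRQL query that prompt could reasonably resolve to.
NRQL = (
    "SELECT count(*) FROM TransactionError "
    "WHERE appName = 'checkout' SINCE 1 hour ago"
)

# A sample result row in the shape a count(*) NRQL query returns.
SAMPLE_RESULTS = [{"count": 42}]


def describe(results: list[dict]) -> str:
    """Turn the raw result row back into a one-line, plain-language answer."""
    count = results[0]["count"]
    return f"The checkout service recorded {count} errors in the last hour."


print("Prompt:", PROMPT)
print("NRQL:  ", NRQL)
print("Answer:", describe(SAMPLE_RESULTS))
```

In practice New Relic Grok generates and runs the query for you; the sketch only shows the shape of the round trip from prompt to query, and from result back to a plain-language answer.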
Frequently Asked Questions
When will New Relic Grok AI be available? What is its price?
New Relic Grok AI is currently in preview. Customers who sign up for early access to New Relic Grok will be admitted to the preview as spots become available. Please note that we cannot guarantee a specific date when access or features will be available in the preview. New Relic Grok will be free to all participants during the preview. Final pricing for New Relic Grok will be determined at general availability.
Why is it called New Relic Grok AI?
Grok means “to understand profoundly and intuitively.” The term has been adopted by technologists in a variety of ways. By definition, grok is a perfect fit for New Relic’s mission of making observability available to all and allowing everyone to gain insight into the state of their systems by asking plain questions.
Conclusion
New Relic Grok AI aims to address these hurdles. It delivers high-quality insights by applying generative AI to a hyperscale, integrated telemetry data platform, making it easier for engineers to understand complex systems and putting observability within reach of all engineers, regardless of experience.