What is an "Expert"? How do we create our own expert?


DagKnows lets you create your own AI agents, which we call Experts. If you want to give the AI special instructions for a particular kind of task, but don't want to repeat those instructions every time you type a prompt, you can create an expert. For example, if you want an AI agent that handles queries about your cloud costs, you can create an expert and put all of your special instructions into it. It is like training a new hire: you give them all the instructions up front so that they can be productive.


To create an expert:

  • Go to the Experts tab in the left nav bar.
  • Click Create Expert.
  • Fill in the template.
  • You must give the expert a name. No whitespace or special characters except '_', for example aws_cost_expert.
  • A title is required. For example: "Expert in handling cloud cost related tasks". Our AI engine uses this title to decide which expert to assign a given task to.
  • Enter a Description. This is where you add all the instructions for the expert: what it should do, what it should not do, and so on.
  • You can also add a list of Environment variables the expert has access to. Each environment variable entry has a variable name and a variable description.
  • Keywords is a comma-separated list of keywords related to the agent. If any of these keywords appears in a user's prompt, the expert instructions in the Description section are appended to the main system prompt and sent to the LLM (see the sketch after this list). Choose keywords carefully. If they are too generic, every prompt that mentions one will pull in the expert instructions and bloat the system prompt. If they are too rare, a user may forget to mention a keyword, the LLM won't receive the expert instructions, and it may hallucinate.
  • Collaborators is a list of other experts this expert collaborates with. The collaborators' expert prompts are also appended to the system prompt. For example, our aws_cost_expert can collaborate with aws_expert, which may have other generic AWS-related instructions.
  • You can add Tags to the expert for classification. However, the most important tag is "expertprompt"; it is added for you by default. If you want to disable the expert, remove this tag. To enable it again, just add the "expertprompt" tag back.
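To make the fields above concrete, here is a minimal, hypothetical sketch in Python of how an expert definition and the keyword-based prompt assembly described above might fit together. The Expert class, the build_system_prompt function, and the aws_cost_expert values are illustrative assumptions, not DagKnows' actual API or internal implementation.

    # Hypothetical sketch only: field names mirror the expert template above;
    # the class and function names are illustrative, not DagKnows' real API.
    from dataclasses import dataclass, field

    @dataclass
    class Expert:
        name: str                    # no whitespace or special chars except '_'
        title: str                   # used to decide which expert handles a task
        description: str             # the expert's instructions
        env_vars: dict = field(default_factory=dict)     # variable name -> description
        keywords: list = field(default_factory=list)     # comma-separated in the UI
        collaborators: list = field(default_factory=list)
        tags: list = field(default_factory=lambda: ["expertprompt"])  # remove to disable

    aws_cost_expert = Expert(
        name="aws_cost_expert",
        title="Expert in handling cloud cost related tasks",
        description="When asked about AWS spend, break costs down by service and region...",
        env_vars={"AWS_REGION": "Default region to query for cost data"},
        keywords=["cost", "billing", "aws spend"],
        collaborators=["aws_expert"],
    )

    def build_system_prompt(base_prompt, user_prompt, experts):
        """Append the instructions of every enabled expert whose keywords
        appear in the user's prompt, plus its collaborators' instructions."""
        by_name = {e.name: e for e in experts}
        prompt = base_prompt
        for expert in experts:
            if "expertprompt" not in expert.tags:
                continue  # expert is disabled
            if any(kw.lower() in user_prompt.lower() for kw in expert.keywords):
                prompt += "\n\n" + expert.description
                for collab in expert.collaborators:
                    if collab in by_name:
                        prompt += "\n\n" + by_name[collab].description
        return prompt

    # Example: a prompt mentioning "billing" pulls in aws_cost_expert's
    # instructions (and its collaborators' instructions, if they are defined).
    print(build_system_prompt("You are a helpful runbook assistant.",
                              "Why did our billing jump last month?",
                              [aws_cost_expert]))

The sketch also shows why keyword choice matters: only prompts that mention one of the expert's keywords cause its Description (and its collaborators' Descriptions) to be appended to the system prompt.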

