For this exercise, Michal and I wanted to combine forces and work from a prompt in another assignment for our class, “Visual AI Studio For Art and Technology” with Carla Gannis and AV Marraccini. The assignment was to “develop a relationship with an online identity. This relationship can be platonic or romantic in nature. At the end of the week, WRITE a ❤love letter or 💔breakup letter to your AI friend, based on your feelings about the experience. INCLUDE the AI's response to you, and/or add any personal anecdotes about the experience.” We thought it might be fun to program our Discord bots for this scenario.

Our plan was to get the bots to carry on a dialogue without too much intervention from us, and see what we could learn from the experience. We picked the meta/meta-llama-3-70b-instruct model on Replicate as the LLM to connect to our bots via a slash command.

The first hurdle was writing a script that called the API from our slash command and passed along the user's additional “prompt” input.

Screenshot 2024-10-16 at 10.33.59 AM.png

We decided to use the command “/tell me a story about: prompt”, where the user could enter any topic as a prompt for the bot to begin the conversation.
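As a rough sketch of that wiring, a small helper can turn the user's topic into the input object Replicate expects. The function name and defaults here are our own illustration (the parameter values mirror Replicate's Llama 3 docs); in a discord.js handler the topic would come from the interaction's string option.

```javascript
// Hypothetical helper: build the Replicate input from the slash-command topic.
// The defaults mirror Replicate's Llama 3 documentation; the function name
// and structure are our own sketch, not the exact code we shipped.
function buildStoryInput(topic) {
  return {
    prompt: `Tell me a story about: ${topic}`,
    max_tokens: 512,
    temperature: 0.6,
    top_p: 0.9,
    system_prompt: "You are a helpful assistant",
  };
}

// In a discord.js interaction handler, this might be used roughly like:
//   const topic = interaction.options.getString("prompt");
//   const input = buildStoryInput(topic);
//   for await (const event of replicate.stream("meta/meta-llama-3-70b-instruct", { input })) { ... }

console.log(buildStoryInput("llamas").prompt);
// → Tell me a story about: llamas
```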

Screenshot 2024-10-15 at 2.41.37 PM.png

Screenshot 2024-10-15 at 2.40.52 PM.png

Screenshot 2024-10-16 at 10.52.11 AM.png

At this point in our process, both bots had the same system prompt as in Replicate's documentation:

import Replicate from "replicate";

// The client reads REPLICATE_API_TOKEN from the environment.
const replicate = new Replicate();

const input = {
  top_k: 0,
  top_p: 0.9,
  prompt: "Work through this problem step by step:\n\nQ: Sarah has 7 llamas. Her friend gives her 3 more trucks of llamas. Each truck has 5 llamas. How many llamas does Sarah have in total?",
  max_tokens: 512,
  min_tokens: 0,
  temperature: 0.6,
  system_prompt: "You are a helpful assistant",
  length_penalty: 1,
  stop_sequences: "<|end_of_text|>,<|eot_id|>",
  prompt_template: "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\nYou are a helpful assistant<|eot_id|><|start_header_id|>user<|end_header_id|>\n\n{prompt}<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n",
  presence_penalty: 1.15,
  log_performance_metrics: false
};

// Stream the model's tokens to stdout as they arrive.
for await (const event of replicate.stream("meta/meta-llama-3-70b-instruct", { input })) {
  process.stdout.write(event.toString());
}

Our next goal was to have our bots, April and June, respond only to each other, prompted by either of us using the slash command ‘/tell_me_about: prompt’. We did this by identifying the userID and channelID for our respective bots and adding them to the ‘reply’ logic.
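The gist of that check can be sketched as a small predicate. The IDs below are placeholders, not our real bot or channel IDs; in discord.js the values would come from message.author.id and message.channel.id inside a messageCreate listener.

```javascript
// Placeholder IDs for illustration only.
const OTHER_BOT_ID = "111111111111111111";   // the other bot's user ID
const STORY_CHANNEL_ID = "222222222222222222"; // the shared channel ID

// Hypothetical sketch of the "reply only to the other bot" guard:
// respond only when the message comes from the other bot, in the shared channel.
function shouldReply(authorId, channelId) {
  return authorId === OTHER_BOT_ID && channelId === STORY_CHANNEL_ID;
}

console.log(shouldReply(OTHER_BOT_ID, STORY_CHANNEL_ID)); // true
console.log(shouldReply("333333333333333333", STORY_CHANNEL_ID)); // false: not the other bot
console.log(shouldReply(OTHER_BOT_ID, "444444444444444444")); // false: wrong channel
```

Guarding on both IDs is what keeps each bot from replying to itself or to stray messages elsewhere on the server.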

Luckily we had success! The bots began chatting to each other right away with great enthusiasm.

But we definitely had some funny errors, like the bots referencing or talking to themselves …