
ChatGPT can now control a robot arm

Pivot 5: 5 stories. 5 minutes a day. 5 days a week.

1. ChatGPT can now control a robot arm

Researchers at UC Berkeley and ETH Zurich have used OpenAI's GPT-4o large language model to teach inexpensive robot arms to clean up spills. Given access to a sponge, the arms were trained in just four days to identify and wipe up a nearby spill.

The arms were trained on around 100 demonstrations, are entirely open source, and can be built at home by following a YouTube playlist. The experiment is a proof of concept for a robot control architecture, demonstrating how open source is democratizing robotics; a rough sketch of that kind of control loop follows below.
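The write-up doesn't include code, but the general pattern it describes, a vision-language model picking high-level actions for a low-cost arm, can be sketched roughly as below. The `capture_frame` and `execute` helpers are hypothetical stand-ins for whatever camera and arm drivers a particular build uses; only the GPT-4o call uses OpenAI's real Python SDK.

```python
# Rough sketch of an LLM-in-the-loop control cycle. capture_frame()/execute()
# are hypothetical camera/arm helpers; the GPT-4o call is OpenAI's Python SDK.
import base64, json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

ACTIONS = ["move_to_spill", "wipe", "return_home", "wait"]

def decide_action(jpeg_bytes: bytes) -> str:
    """Ask GPT-4o to pick the next high-level action from a camera frame."""
    image_b64 = base64.b64encode(jpeg_bytes).decode()
    resp = client.chat.completions.create(
        model="gpt-4o",
        messages=[{
            "role": "user",
            "content": [
                {"type": "text",
                 "text": f"A robot arm holding a sponge sees this scene. "
                         f"Reply with one action from {ACTIONS} as JSON: "
                         '{"action": "..."}'},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
            ],
        }],
        response_format={"type": "json_object"},
    )
    return json.loads(resp.choices[0].message.content)["action"]

# Control loop: perceive, ask the model, act, repeat.
# while True:
#     frame = capture_frame()          # hypothetical camera driver
#     execute(decide_action(frame))    # hypothetical arm driver
```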

2. Can Pictionary and Minecraft test AI models' ingenuity?


AI enthusiasts are using games to test AI models' problem-solving skills. Paul Calcraft and Adonis Singh have built apps in which two AI models play a Pictionary-like game, forcing the models to think beyond their training data.

Minecraft is also considered an un-gameable benchmark, giving models control over a character and testing their ability to design structures. The approach is not new, but it provides a visual, intuitive way to compare how models perform and behave. A sketch of such a two-model game round follows below.
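Neither app is described in code, but the basic round, one model produces a drawing as markup and a second model guesses what it depicts, can be sketched as follows. The prompts, model pairing, and scoring are illustrative assumptions, not the apps' actual logic; only the chat call is OpenAI's real SDK.

```python
# Illustrative sketch of a two-model "Pictionary" round: a drawer model emits
# an SVG for a secret word, a guesser model names what the SVG depicts.
# Prompts and scoring are assumptions; the chat call is OpenAI's real SDK.
from openai import OpenAI

client = OpenAI()

def ask(model: str, prompt: str) -> str:
    resp = client.chat.completions.create(
        model=model, messages=[{"role": "user", "content": prompt}]
    )
    return resp.choices[0].message.content.strip()

def play_round(word: str, drawer: str = "gpt-4o", guesser: str = "gpt-4o-mini") -> bool:
    svg = ask(drawer, f"Draw '{word}' as a minimal SVG. Output only SVG markup, no text labels.")
    guess = ask(guesser, f"This SVG depicts a single word:\n{svg}\nAnswer with the word only.")
    return guess.lower().strip(".") == word.lower()

# Example: print(play_round("bicycle"))
```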

Read the full story here

3. Apple’s iOS 18.2 public beta starts opening up access to more AI features


Apple has released iOS and iPadOS 18.2 into public beta, introducing features such as Genmoji, Image Playground, ChatGPT integration, and Visual Intelligence for iPhone 16 cameras.

Access to some features is still gated behind a secondary waitlist, while others require an account. Apple also released the public beta of macOS 15.2, which includes similar features to the iPhone and iPad releases but lacks Visual Intelligence and Genmoji, which remain exclusive to iPhones and iPads.

Read the full story here

4. AI-driven mobile robots team up to tackle chemical synthesis


Researchers at the University of Liverpool have developed AI-driven mobile robots that can perform chemical synthesis research with extraordinary efficiency. The 1.75-meter-tall robots were designed to tackle three primary problems in exploratory chemistry: performing reactions, analyzing products, and deciding what to do next based on data.

The robots made the same or similar decisions as a human researcher, but reached them far more quickly; the equivalent human analysis could take hours. Their AI logic processed analytical datasets to make autonomous decisions, making the tasks more efficient and accurate; a generic sketch of that kind of loop follows below.
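The team's decision logic isn't reproduced here, but the closed loop described, run a reaction, analyze the product, decide the next step from the data, follows a familiar autonomous-experimentation pattern. The sketch below is a generic illustration with made-up thresholds and helper names, not the Liverpool robots' actual code.

```python
# Generic closed-loop experimentation sketch: react, analyze, decide next step.
# The dataclass, thresholds, and decide() rules are illustrative assumptions,
# not the Liverpool robots' actual decision logic.
from dataclasses import dataclass

@dataclass
class AnalysisResult:
    yield_pct: float   # e.g. quantified from NMR / LC-MS data
    purity_pct: float

def decide(result: AnalysisResult) -> str:
    """Pick the next action from the latest analytical dataset."""
    if result.yield_pct >= 70 and result.purity_pct >= 95:
        return "scale_up"          # promising hit: repeat at larger scale
    if result.yield_pct >= 20:
        return "vary_conditions"   # some product: tweak temperature/reagents
    return "abandon_route"         # dead end: move on to the next candidate

# One iteration of the loop the robots run autonomously:
# result = analyze(run_reaction(next_candidate))   # hypothetical robot steps
# next_step = decide(result)
```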

Read the full story here

5. SambaNova and Hugging Face make AI chatbot deployment easier with one-click integration


SambaNova and Hugging Face have partnered to launch a new integration that allows developers to deploy ChatGPT-like interfaces with a single click, reducing deployment time from hours to minutes. The integration supports both text-only chatbots and multimodal ones that can process text and images.

Performance metrics show processing speeds of up to 358 tokens per second on unconstrained hardware. The integration could broaden AI deployment across organizations with varying levels of technical expertise; a sketch of the runtime call it wraps follows below. To encourage adoption, SambaNova and Hugging Face will host a hackathon in December.
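The one-click flow itself lives in the companies' announcement, but the runtime piece, sending chat requests to a SambaNova-hosted model, can be sketched as follows. The endpoint URL and model name are assumptions based on SambaNova Cloud advertising an OpenAI-compatible API; check the provider's current documentation before relying on them.

```python
# Minimal sketch of calling a SambaNova-hosted model through an
# OpenAI-compatible client. The base_url and model name are assumptions;
# verify against current SambaNova / Hugging Face documentation.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.sambanova.ai/v1",      # assumed endpoint
    api_key=os.environ["SAMBANOVA_API_KEY"],
)

resp = client.chat.completions.create(
    model="Meta-Llama-3.1-8B-Instruct",          # assumed model identifier
    messages=[{"role": "user", "content": "Summarize today's AI news in one line."}],
)
print(resp.choices[0].message.content)
```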

Read the full story here