10 Trending Stories in Radical Data Science

Published On Sat Nov 09 2024


Welcome to the AI News Briefs Bulletin Board, a timely channel bringing you the latest industry insights and perspectives on AI, including deep learning, large language models, generative AI, and transformers. I am working tirelessly to dig up the most timely and curious tidbits underlying the day’s most popular technologies. This field is advancing rapidly, and I want to give you a regular resource that keeps you informed and up to date. News bites are added in reverse date order (most recent on top), so you can check back often to see what’s happening in our rapidly accelerating industry. Click HERE to check out previous “AI News Briefs” round-ups.

Latest Updates:

[11/8/2024] Did OpenAI just spend more than $10 million on a URL?

On Nov. 6, OpenAI CEO Sam Altman posted a simple URL on X: chat.com. It automatically routes to ChatGPT.

[11/8/2024] NVIDIA CEO Jensen Huang’s Special Address at AI Summit India

Watch Jensen Huang’s special address to see the role of AI in India’s digital transformation and how it’s fueling innovation, economic growth, and global leadership.

[11/8/2024] NEW research paper: “Mixtures of In-Context Learners”

In-context learning (ICL) allows LLMs to adapt without fine-tuning but suffers from increased memory complexity and inefficiency. Mixtures of In-Context Learners (MoICL) addresses these issues by treating subsets of demonstrations as experts and weighting their predictions, showing up to a 13% improvement on classification tasks.

Mixtures of In-Context Learners
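The core idea can be sketched in a few lines: split the demonstration pool into subsets, treat each subset as an expert, and mix the experts’ label distributions with a set of weights. This is a minimal toy sketch, not the paper’s implementation — a hypothetical word-overlap scorer stands in for an LLM conditioned on each demonstration subset, and the names `DEMOS`, `expert_distribution`, and `moicl_predict` are illustrative.

```python
# Toy demonstration pool of (text, label) pairs, standing in for ICL examples.
DEMOS = [
    ("great movie loved it", "pos"),
    ("fantastic and fun", "pos"),
    ("terrible waste of time", "neg"),
    ("boring and bad", "neg"),
]

def expert_distribution(subset, query):
    """One 'expert': a stand-in for an LLM prompted with this demo subset.
    Here it scores each label by word overlap between the query and demos."""
    scores = {"pos": 1e-6, "neg": 1e-6}  # smoothing so no label has zero mass
    query_words = set(query.split())
    for text, label in subset:
        scores[label] += len(query_words & set(text.split()))
    total = sum(scores.values())
    return {lab: s / total for lab, s in scores.items()}

def moicl_predict(demos, query, n_experts=2, weights=None):
    """Partition demos into n_experts subsets and mix their distributions.
    The paper learns the mixture weights; here they default to uniform."""
    subsets = [demos[i::n_experts] for i in range(n_experts)]
    if weights is None:
        weights = [1.0 / n_experts] * n_experts
    mixed = {"pos": 0.0, "neg": 0.0}
    for w, subset in zip(weights, subsets):
        dist = expert_distribution(subset, query)
        for lab in mixed:
            mixed[lab] += w * dist[lab]
    return max(mixed, key=mixed.get), mixed

label, dist = moicl_predict(DEMOS, "loved this fantastic movie")
print(label)  # "pos"
```

Because each expert only ever conditions on its own subset, no single prompt has to hold all demonstrations, which is where the memory savings over vanilla ICL come from.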

[11/8/2024] Andrew Ng and DeepLearning.AI announce a new course with focus on agent memory and persistent storage

Learn how to build agentic memory into your applications in this short course, LLMs as Operating Systems: Agent Memory, created in partnership with Letta, and taught by its founders Charles Packer and Sarah Wooders.

[11/7/2024] 25% of Google’s new code generated by AI

Google is now using AI to generate over 25% of its new code, which is then reviewed by engineers to enhance productivity and speed up development, as shared by CEO Sundar Pichai.

Google CEO says over 25% of new Google code is generated by AI

[11/7/2024] Microsoft releases Magentic-One, an open-source multi-agent system to automate complex tasks with AI

Magentic-One is a new generalist multi-agent system for solving open-ended web and file-based tasks across a variety of domains, representing a significant step towards agents that can complete the tasks people encounter in their work and personal lives.

[11/7/2024] IBM reveals complete dataset list used to train Granite 3.0 LLMs for unmatched transparency

Granite 3.0 language models are a new set of lightweight, state-of-the-art open foundation models that natively support multilinguality, coding, reasoning, and tool usage.