The recent release of OpenAI o1 has drawn significant attention to large reasoning models (LRMs) and is inspiring new models aimed at solving complex problems that classic language models often struggle with. Building on the success of […]
November 26, 2024 Fabrizio Ferri-Benedetti, who spent many years as a technical writer for Splunk and New Relic, joins Ben and Ryan for a conversation about the evolving role of documentation in software development. They explore how documentation can (and should) be integrated with code, the importance of quality control, and the hurdles to maintaining […]
Tech titan Marc Benioff says we’re near the “upper limits” of LLM use in AI advancement. In a podcast, the Salesforce CEO said the future of AI lies in agents that work autonomously. “‘Terminator’? Maybe we’ll be there one day,” Benioff said, referencing the 1984 film about a cyborg assassin. Salesforce CEO Marc Benioff, in […]
A more talkative Siri might arrive as part of iOS 19 next year. Apple plans to further improve Siri with iOS 19 in 2025 by using next-generation large language models (LLMs). Thanks to the new LLMs powering Siri, the voice assistant should become much more conversational. Siri could get […]
If you wonder how Large Language Models (LLMs) work and aren’t afraid of getting a bit technical, don’t miss [Brendan Bycroft]’s LLM Visualization. It is an interactively animated, step-by-step walk-through of a GPT large language model, complete with an animated, interactive 3D block diagram of everything going on under the hood. Check it out! nano-gpt has […]
OpenAI is facing diminishing returns with its latest AI model while navigating the pressures of recent investments. According to The Information, OpenAI’s next AI model – codenamed Orion – is delivering smaller performance gains compared to its predecessors. In employee testing, Orion reportedly achieved the performance level of GPT-4 after completing just 20% of its […]
Ben and Ryan are joined by Matt Zeiler, founder and CEO of Clarifai, an AI workflow orchestration platform. They talk about how the transformer architecture supplanted convolutional neural networks in AI applications, the infrastructure required for AI implementation, the implications of regulating AI, and the value of synthetic data.
In today’s fast-paced digital landscape, businesses relying on AI face new challenges: the latency, memory usage, and compute costs of running an AI model. As AI advances rapidly, the models powering these innovations have grown increasingly […]
November 8, 2024 Or Lenchner, CEO of Bright Data, joins Ben and Ryan for a deep-dive conversation about the evolving landscape of web data. They talk through the challenges involved in data collection, the role of synthetic data in training large AI models, and how public data access is becoming more restrictive. Or also shares […]
Fine-tuning techniques: The term “fine-tuning” refers to further training a pretrained model. In the case of LLMs, this means that we take a pretrained foundation model and train it some more. But this additional training can be done in so many different ways that the concept of fine-tuning becomes incredibly vague. This […]
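To make the idea concrete, here is a minimal sketch of one common variant: continued supervised training of a pretrained causal language model using the Hugging Face Transformers Trainer. The model name (gpt2), the placeholder dataset (wikitext-2), and the hyperparameters are assumptions chosen purely for illustration, not details from the excerpt above.

```python
# Minimal sketch of one fine-tuning approach: further training a pretrained
# causal LM on a small text corpus. Model, dataset, and hyperparameters are
# illustrative assumptions, not prescribed by the article.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "gpt2"  # small pretrained foundation model, used as a stand-in
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 defines no pad token
model = AutoModelForCausalLM.from_pretrained(model_name)

# Any text dataset works; a slice of wikitext-2 keeps the example quick.
dataset = load_dataset("wikitext", "wikitext-2-raw-v1", split="train[:1%]")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="ft-out",
        per_device_train_batch_size=2,
        num_train_epochs=1,
        learning_rate=5e-5,  # lower learning rate than pretraining is typical
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

Swapping in a different dataset, freezing some layers, or replacing full-parameter updates with an adapter method such as LoRA would all still count as “fine-tuning,” which is exactly why the term on its own is so vague.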