Software
AWS Unveils New Service to Battle AI Hallucinations
2024-12-03
Amazon Web Services (AWS), the cloud computing division of Amazon, is making significant strides in addressing the issue of hallucinations in AI models. At the AWS re:Invent 2024 conference in Las Vegas, the company announced a new service called Automated Reasoning Checks, which validates a model's responses by cross-referencing them against customer-supplied information. AWS claims it is the "first" and "only" safeguard against hallucinations, but that claim may be somewhat exaggerated.

Similarities and Differences with Other Providers

Automated Reasoning Checks is remarkably similar to the Correction feature Microsoft rolled out this summer, which also flags AI-generated text that might be factually wrong. Google offers a similar tool in its Vertex AI platform, allowing customers to "ground" models using data from third-party providers or their own datasets.

Part of the Guardrails tool in AWS's Bedrock model hosting service, Automated Reasoning Checks attempts to determine how a model arrived at an answer and discern whether it is correct. Customers upload information to establish a ground truth, and the tool creates rules that can be refined and applied to the model. As the model generates responses, Automated Reasoning Checks verifies them and, in the case of a probable hallucination, provides the correct answer.

AWS states that PwC is already using Automated Reasoning Checks to design AI assistants for its clients. Swami Sivasubramanian, VP of AI and data at AWS, believes that such tooling is attracting customers to Bedrock, whose customer base grew 4.7 times over the past year to tens of thousands of customers.

However, as one expert pointed out this summer, trying to eliminate hallucinations from generative AI is like trying to eliminate hydrogen from water. AI models hallucinate because they don't actually "know" anything; they are statistical systems that identify patterns and predict answers based on previously seen examples. AWS claims that Automated Reasoning Checks uses "logically accurate" and "verifiable reasoning" to arrive at its conclusions, but it has not provided any data demonstrating the tool's reliability.
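To make that workflow concrete, here is a minimal conceptual sketch of the ground-truth, rules, and check loop described above. It is not the Bedrock Guardrails API; the refund-policy data and helper functions are hypothetical illustrations.

```python
# Conceptual sketch only: the ground truth -> rules -> check loop that
# Automated Reasoning Checks performs. NOT the Bedrock Guardrails API;
# the refund-policy example and helpers below are hypothetical.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    """A constraint derived from customer-supplied ground truth."""
    description: str
    check: Callable[[str], bool]  # True if the model's answer satisfies the rule

def build_rules(ground_truth: dict) -> list[Rule]:
    """Derive a simple rule from uploaded policy data (hypothetical example)."""
    days = str(ground_truth["refund_window_days"])
    return [Rule(
        description="quoted refund window must match policy",
        check=lambda answer: days in answer,
    )]

def validate(answer: str, rules: list[Rule], ground_truth: dict) -> str:
    """Flag a probable hallucination and surface the correct value if a rule fails."""
    for rule in rules:
        if not rule.check(answer):
            return (f"Probable hallucination ({rule.description}); "
                    f"correct value: {ground_truth['refund_window_days']} days")
    return "Answer is consistent with the ground truth"

truth = {"refund_window_days": 30}
print(validate("You can request a refund within 90 days.", build_rules(truth), truth))
```

In the actual service, the rules are derived automatically from the uploaded information and can be refined by the customer, rather than being hand-written checks like these.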

Model Distillation: Transferring Capabilities

In other Bedrock news, AWS announced Model Distillation today. This tool allows the transfer of the capabilities of a large model (such as Llama 405B) to a smaller model (like Llama 8B), which is cheaper and faster to run. It is a response to Microsoft's Distillation in Azure AI Foundry and provides a way to experiment with different models without incurring high costs.

After the customer provides sample prompts, Amazon Bedrock generates responses and fine-tunes the smaller model. It can even create additional sample data if needed to complete the distillation process. However, there are some caveats. Model Distillation currently only works with Bedrock-hosted models from Anthropic and Meta, customers must select a large and small model from the same "family," and the distilled models will lose some accuracy, according to AWS (less than 2%).
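As a rough illustration of what the service automates, the sketch below shows the basic distillation loop: a large "teacher" model answers the customer's sample prompts, and the smaller "student" model is fine-tuned on the resulting prompt/response pairs. The invoke_teacher and fine_tune_student functions are placeholders, not Bedrock APIs.

```python
# Conceptual sketch of the loop Model Distillation automates. The teacher and
# student calls below are placeholders, not Bedrock APIs.

def invoke_teacher(prompt: str) -> str:
    """Placeholder for calling the large teacher model (e.g., Llama 405B)."""
    return f"<teacher answer to: {prompt}>"

def fine_tune_student(training_pairs: list[tuple[str, str]]) -> None:
    """Placeholder for fine-tuning the smaller student model (e.g., Llama 8B)."""
    print(f"Fine-tuning student on {len(training_pairs)} synthetic examples")

sample_prompts = [
    "Summarize our Q3 support-ticket trends.",
    "Draft a polite reply to a delayed-shipment complaint.",
]

# Per AWS, Bedrock can also generate extra synthetic prompts when the
# customer-supplied samples are too few to complete the distillation.
training_pairs = [(p, invoke_teacher(p)) for p in sample_prompts]
fine_tune_student(training_pairs)
```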

Multi-Agent Collaboration: Streamlining Projects

Also available in preview is "multi-agent collaboration," a new Bedrock feature that allows customers to assign AI to subtasks within a larger project. Part of Bedrock Agents, and a contribution to the broader AI agent craze, multi-agent collaboration provides tools to create and tune AI for tasks such as reviewing financial records and assessing global trends.

Customers can designate a "supervisor agent" to break up and route tasks automatically. The supervisor can give specific agents access to the necessary information and determine which actions can be processed in parallel and which need details from other tasks before an agent can move forward. Once all the specialized agents have completed their work, the supervisor agent pulls the information together and synthesizes the results.

This feature sounds promising, but like all new technologies, we will need to see how well it performs in real-world deployments.
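The supervisor/worker pattern behind the feature can be sketched in a few lines of plain Python. The agents and tasks below are illustrative stand-ins, not the Bedrock Agents API; the point is simply that independent subtasks run in parallel and the supervisor synthesizes the outputs.

```python
# Illustrative supervisor/worker sketch, not the Bedrock Agents API.
from concurrent.futures import ThreadPoolExecutor

def records_agent(task: str) -> str:
    return f"[records agent] reviewed financial records for: {task}"

def trends_agent(task: str) -> str:
    return f"[trends agent] assessed global trends for: {task}"

def supervisor(project: str) -> str:
    # Subtasks with no dependencies on one another can run in parallel.
    parallel_subtasks = [(records_agent, project), (trends_agent, project)]
    with ThreadPoolExecutor() as pool:
        findings = list(pool.map(lambda t: t[0](t[1]), parallel_subtasks))
    # The supervisor pulls the specialists' outputs together into one answer.
    return "\n".join(findings) + f"\n[supervisor] combined report for: {project}"

print(supervisor("2025 market-entry assessment"))
```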
Celeb Greetings App Cameo Shifts to Creator Focus
2024-12-03
Celeb greetings app Cameo has embarked on a significant transformation by introducing a new product called CameoX. This move comes as the company, which initially became a pandemic hit as people marked special occasions with celebrity shout-outs, now faces new challenges in a post-pandemic world.

Background and Challenges

Cameo initially soared to success during the pandemic, becoming a tech unicorn valued at $1 billion. However, as life returned to normal, it faced multiple rounds of layoffs and saw its valuation decline. One of the main issues was its reliance on video messages and the inability to make other revenue-generating products stick, such as live video calls or events. It even dabbled in crypto with its NFT project, Cameo Pass.

The struggle to attract A-list celebrities was just one part of the problem. Cameo needed to find a way to expand its reach and tap into new talent.

Embracing Lesser-Known Talents

With CameoX, Cameo is now embracing its status as a platform for lesser-known talents. The company has been piloting the service since May 2023, making it accessible to a broader creator community. Creators can now enroll themselves by filling out a form, downloading the app, and verifying their identity. This puts Cameo more in line with other creator-focused platforms like YouTube and Twitch.

Over the past 18 months, CameoX has added 31,000 creators who have collectively performed over 155,000 Cameos, resulting in millions in bookings. These creators range from reality TV stars to influencers and even questionable characters like former Congressman George Santos, suggesting that Cameo's strategy of targeting a wider pool of noteworthy individuals is paying off.

Boosting Revenue and Competing

By making Cameo less exclusive and more of a creator toolkit, the company hopes to boost its revenue. It earns a 30% share of bookings, and with more creators on board, there is potential for increased earnings. However, entering this space means competing with platforms that creators already use to monetize their fanbases, such as YouTube, Twitch, Instagram, and TikTok.

Cameo CEO Steven Galanis believes that the nature of fame has changed: digital-native creators with engaged fan bases are becoming more successful, and there is a place for everyone on Cameo.

To date, Cameo has generated over $310 million in talent earnings across over 8.2 million bookings, highlighting the potential of its new approach.
Amazon AWS SageMaker Unifies Data Control in 2024
2024-12-03
Amazon Web Services (AWS) has long been at the forefront of cloud computing, and its SageMaker platform has been a key player in creating, training, and deploying AI models. This year, however, the focus shifted toward streamlining. At the re:Invent 2024 conference, AWS unveiled SageMaker Unified Studio, a tool that brings together data from across the organization in one place.

Unveiling SageMaker Unified Studio

SageMaker Unified Studio is a single destination for data discovery and work. It combines tools from various AWS services, including the existing SageMaker Studio, to assist customers in preparing and processing data for model building. As Swami Sivasubramanian, VP of data and AI at AWS, stated, "We are witnessing a convergence of analytics and AI, and SageMaker Unified Studio provides the necessary tools in one place."

Customers can publish and share data, models, and other artifacts within their teams or the entire organization using SageMaker Unified Studio. The service offers data security controls and adjustable permissions, along with integrations with AWS's Bedrock model development platform.

AI is integrated into SageMaker Unified Studio through Q Developer, Amazon's coding chatbot. It can answer questions such as "What data should I use to gain a better understanding of product sales?" or "Generate SQL to calculate total revenue by product category." AWS explained in a blog post that Q Developer supports development tasks like data discovery, coding, SQL generation, and data integration within SageMaker Unified Studio.

Expanding the SageMaker Product Family

In addition to SageMaker Unified Studio, AWS launched two small yet significant additions to its SageMaker product family. SageMaker Catalog allows admins to define and implement access policies for AI apps, models, tools, and data using a single permission model with granular controls. SageMaker Lakehouse, on the other hand, provides connections from SageMaker and other tools to data stored in AWS data lakes, data warehouses, and enterprise apps.

AWS emphasizes that SageMaker Lakehouse works with any tools compatible with Apache Iceberg standards, an open-source format for large analytic tables. Admins have the option to apply access controls across data in all the analytics and AI tools that SageMaker Lakehouse interacts with.
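Because the Lakehouse speaks the Iceberg standard, any Iceberg-compatible client should in principle be able to read the same tables. The sketch below uses the open-source pyiceberg library; the REST endpoint, warehouse, and table names are made-up placeholders rather than real AWS values.

```python
# Illustrative sketch: reading a Lakehouse-managed table with any
# Iceberg-compatible client (here, the open-source pyiceberg library).
# The endpoint, warehouse, and table identifiers are placeholders.
from pyiceberg.catalog import load_catalog

catalog = load_catalog(
    "lakehouse",
    **{
        "type": "rest",
        "uri": "https://example-iceberg-endpoint.amazonaws.com",  # placeholder
        "warehouse": "example-warehouse",                          # placeholder
    },
)

table = catalog.load_table("sales.orders")   # placeholder namespace.table
df = table.scan(limit=10).to_pandas()        # pull a small sample via Iceberg
print(df.head())
```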

Improved Integration with SaaS Applications

In a related development, SageMaker now integrates better with software-as-a-service applications. Thanks to these new integrations, SageMaker customers can access data from apps like Zendesk and SAP without the need for extraction, transformation, and loading. AWS wrote, "Customers often have data spread across multiple data lakes and a data warehouse. A simple way to unify this data would benefit them. Now, they can use their preferred analytics and machine learning tools on their data, regardless of where it is physically stored, to support various use cases including SQL analytics, ad-hoc querying, data science, machine learning, and generative AI."