Meta rolled out AI translation to its Ray-Ban smart glasses while startup Own.App launched a decentralized social platform promising full content ownership and higher creator payouts, intensifying competition in the rapidly evolving social media industry. PYMNTS takes a look at what’s new.
Magic Vision launched its AI-powered content creation platform on Friday (Dec. 13), promising to cut digital media production time by 70% through automated editing tools for photo, video, and audio content.
The London-based startup combines artificial intelligence with editing features like intelligent object removal, text-to-video generation and voice cloning capabilities. The platform targets creators ranging from amateur photographers to professional media agencies, addressing growing demand for digital content production.
“Magic Vision is more than just an editing tool — it’s a creative partner that evolves alongside users,” said CEO Noel Fargents, citing the platform’s self-learning AI that adapts to workflows over time.
Key features include AI-powered photo manipulation, smart video stabilization and audio synchronization tools.
The company said it has already secured partnerships with major content agencies and media production companies. Looking ahead, Magic Vision plans to expand into virtual reality content creation and add collaboration features.
The platform is available now through magicvision.app, offering tools for creators seeking to streamline their digital media production process.
Meta announced it would begin rolling out new artificial intelligence and translation capabilities to its Ray-Ban smart glasses through a software update starting Monday (Dec. 16).
The v11 software update will give Early Access Program members in the U.S. and Canada access to two new features. “Live AI” will allow the glasses to see and discuss what users are looking at in real time, providing hands-free assistance with tasks like meal prep and exploring new neighborhoods without requiring the “Hey Meta” wake phrase for follow-up questions.
For early access members, the glasses will also be able to translate speech in real time between English and Spanish, French or Italian — a feature CEO Mark Zuckerberg demonstrated at Connect 2024. Users will be able to hear translations through the glasses’ speakers or view them as transcripts on their phones.
The company is also adding Shazam integration for U.S. and Canadian users, enabling them to identify songs by asking, “Hey Meta, what is this song?”
Meta noted these AI features are still being tested and may not always work perfectly. The company plans to release additional features in 2025.
A new artificial intelligence platform claims it can reduce social media management time by 99.75% and cut content creation costs by up to 70% for small businesses.
Aggie, launched by data analytics firm Audience Genomics, automates the creation and scheduling of social media posts across multiple platforms. According to company executives, the tool generates a month’s worth of content in minutes.
The platform builds on Audience Genomics’ experience serving major brands like Universal Studios and Fenty Beauty. CEO Greg Weinstein says Aggie’s algorithm draws from six years of social media data from approximately 5,000 companies across 150 industries.
“Social media doesn’t have to be so overwhelming,” said Weinstein, who previously led A&E’s Digital Content Studio.
The company secured $3.2 million in funding led by SPO Capital Investments LLC. Early adopters report promising results, including skincare startup Gleem Beauty, which says its sales nearly tripled after it implemented the platform.
Aggie currently supports Instagram, LinkedIn, X, Facebook and Threads, with plans to expand to additional platforms and video content in the coming months.
The post From Translation to Creation, AI Reshapes Social Media Landscape appeared first on PYMNTS.com.