Google doubles down on AI in its Android Show!
Also: Notion is becoming an OS for AI agents, while Amazon unveils a new AI-powered shopping assistant 🛍️.
Attio is the AI CRM for high-growth teams.
Connect your email, calls, product data and more, and Attio instantly builds your CRM with enriched data and complete context. Whether you're running product-led growth or enterprise sales, Attio adapts to your unique GTM motion.
Then Ask Attio to plan your next move.
Run deep web research on prospects. Update your pipeline as you work. Find customers and draft outreach emails. Powered by Universal Context, Attio's intelligence layer, Attio searches, updates, and creates across your data to accelerate your workflow.
Ask more from your CRM.

Spot the trend in the headlines? Yep, everyone's doubling down on their AI bets.
Forward thinkers, hello and welcome to issue #158 of the Neural Frontier.
If those headlines aren't enough to get you pumped, we don't know what will.
Let's dive into the big moves (or bets?) from Google, Notion, and Amazon.
In a rush? Here's your quick byte:
🤖 Google doubles down on AI in its Android Show!
💻 Notion is becoming an OS for AI agents.
🛒 Amazon unveils a new AI-powered shopping assistant!
⚡ The Neural Frontier's weekly spotlight: 3 AI tools making the rounds this week.

Source: Google
At its Android Show ahead of I/O 2026, Google unveiled a flood of new Android and Gemini updates, from AI-powered laptops to vibe-coded widgets and more autonomous Gemini features.
The biggest theme across everything? Google wants Gemini embedded into every layer of the Android experience.
💻 Meet Googlebook
Google officially introduced Googlebook, a new line of laptops built around Gemini Intelligence from the ground up.
The devices, launching this fall with partners like Acer, Asus, Dell, HP, and Lenovo, will include:
“Magic Pointer,” a Gemini-powered cursor
Deep Android phone integration
AI-generated custom widgets
More proactive Gemini assistance across workflows
It's Google's clearest attempt yet to build an AI-native computing experience, rivaling the direction OpenAI, Microsoft, and Apple are all moving in.
🤖 Gemini is becoming more agentic across Android
A lot of the announcements focused less on “chatbot AI” and more on AI that actually does things.
Google showed Gemini taking information from one app and completing tasks across others, building shopping carts from grocery lists, finding events from photos of flyers, and navigating websites and booking things automatically through Chrome.
Android users are also getting Gemini in Chrome, including experimental auto-browsing features that can complete actions on your behalf.
This puts Google much closer to the broader industry shift toward AI agents becoming the interface itself.
🎨 Android is getting more customizable, too
One of the more interesting launches was Create My Widget, which lets users “vibe-code” custom Android widgets using natural language prompts.
Example: “Suggest three high-protein meal prep recipes every week.”
Gemini then generates a functional widget dashboard for your home screen automatically.
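As a hedged sketch of what that prompt-to-widget flow might look like: Google hasn't published how Create My Widget works under the hood, so the spec fields below (`title`, `refresh`, `items`) are purely illustrative. The idea is that a model's reply would be validated into a structured widget spec before anything is rendered on the home screen.

```python
import json

# Hypothetical sketch only: the real Create My Widget pipeline is not public.
# We pretend the model answers a natural-language prompt with a JSON widget
# spec, and validate it before handing it to a (hypothetical) renderer.

def parse_widget_spec(model_reply: str) -> dict:
    """Validate a canned model reply into a widget spec dict."""
    spec = json.loads(model_reply)
    for key in ("title", "refresh", "items"):  # illustrative required fields
        if key not in spec:
            raise ValueError(f"widget spec missing {key!r}")
    return spec

# Canned reply standing in for the model's answer to the meal-prep prompt.
reply = (
    '{"title": "Weekly meal prep", "refresh": "weekly",'
    ' "items": ["Chicken & rice", "Lentil bowls", "Egg wraps"]}'
)
spec = parse_widget_spec(reply)
print(spec["title"])  # Weekly meal prep
```

The validation step is the interesting part: free-form model output becomes safe to render only after it is checked against an expected structure.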
Google also announced:
Refreshed 3D emojis
New creator tools for recording screen + face reactions simultaneously
Better Instagram integrations for Android cameras
“Rambler,” a smarter Gboard dictation system that cleans up filler words automatically
🚗 Android Auto is becoming a lot smarter
Google is also heavily upgrading Android Auto with personalized widgets, Gemini voice assistance, better media interfaces, and full HD video playback in supported vehicles later this year.
Drivers will even be able to place food orders hands-free through integrations like DoorDash.
The bigger takeaway? Last year's Android announcements were mostly: “here's AI inside Android.”
And this year feels more like: “Android itself is becoming an AI operating system.”
Google is clearly trying to make Gemini proactive instead of reactive, embedded instead of separate, and agentic instead of conversational.
And with OpenAI, Anthropic, Microsoft, and Perplexity all racing toward AI-native operating systems and agents, this feels like Google fully signaling that Android is now one of the company's biggest AI battlegrounds.

Source: Notion
Notion Labs just made its biggest AI move yet, and it's much bigger than note-taking.
At a livestreamed event, the company unveiled a new developer platform designed to turn Notion into a central hub where:
AI agents
live company data
workflows
and custom automation
…can all work together inside the same workspace.
The announcement pushes Notion deeper into the rapidly growing “agentic AI” race, where companies are trying to build systems that actually coordinate and execute work.
⚙️ What's new?
The biggest addition is something called Workers, a cloud-based sandbox where teams can deploy custom code directly inside Notion.
That means companies can now sync external databases into Notion, trigger workflows with webhooks, build custom tools and logic, connect AI agents to internal systems, and automate multistep tasks across apps.
And notably, you don't necessarily need to write the code yourself. Notion explicitly says AI coding agents can generate and deploy much of this logic for users.
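To make the database-sync idea concrete, here is a minimal sketch of the kind of transform a Worker might run: mapping a row from an external database (say, Postgres) into a Notion-style page-properties payload. The property shapes (`title`, `select`, `number`) follow Notion's public API conventions, but the field names and the overall flow are illustrative, not the actual Workers interface.

```python
# Hedged sketch: the real Workers runtime and its API may differ.
# row_to_properties maps a record from an external database into a
# Notion-style page-properties dict that a Worker could push via the
# Notion API on each sync or webhook trigger.

def row_to_properties(row: dict) -> dict:
    """Map an external DB row to a Notion-style properties payload."""
    return {
        "Name": {"title": [{"text": {"content": row["name"]}}]},
        "Status": {"select": {"name": row["status"]}},
        "Amount": {"number": float(row["amount"])},  # normalize to number
    }

# Example record as it might arrive from an external system.
row = {"name": "Acme renewal", "status": "Open", "amount": "1200"}
print(row_to_properties(row)["Amount"]["number"])  # 1200.0
```

In practice a Worker would wrap this in a webhook handler or a scheduled sync loop; the mapping function is the piece that stays pure and testable.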
🤖 Notion now supports external AI agents too
Previously, Notion's Custom Agents mostly operated inside the Notion ecosystem. Now the company is opening the doors much wider.
Users can:
chat directly with external AI agents
assign them work inside Notion
monitor progress as if they were native teammates
Supported launch partners include Cursor, Anthropic Claude Code, OpenAI Codex, and Decagon AI.
There's also a new External Agent API for companies that want to connect their own internal agents to Notion workflows.
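Notion hasn't published the External Agent API schema in this announcement, so purely as a hypothetical sketch, a task-assignment call might carry a payload along these lines (every field name below is illustrative):

```python
# Hypothetical payload: the External Agent API schema is not public here,
# so these fields are illustrative, not Notion's real interface.
task_assignment = {
    "agent": "claude-code",              # one of the launch partners named above
    "workspace_page": "Sprint backlog",  # hypothetical page tracking progress
    "instruction": "Triage open bug reports and draft proposed fixes",
    "report_progress": True,             # surface status like a native teammate
}
print(task_assignment["agent"])  # claude-code
```

The point of the shape, whatever the real schema turns out to be: an external agent gets an instruction, a place in the workspace to report into, and a progress channel, which is what "monitor progress as if they were native teammates" implies.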
📊 Live data becomes part of the workspace
One of the more important updates is database syncing. Notion can now continuously pull in live information from external systems like Salesforce, Zendesk, Postgres, and other API-connected databases.
CEO Ivan Zhao described it as turning “your Notion database into a shared canvas powering both workflows and agents.” That's a meaningful shift.
Instead of static docs and wikis, Notion increasingly wants to become a live operational layer for enterprise AI systems.
Overall, this announcement moves Notion closer to competing with workflow automation platforms, internal tooling systems, orchestration layers, and even lightweight enterprise operating systems.

Source: Amazon
Amazon just launched “Alexa for Shopping,” a new AI-powered assistant that turns the search bar itself into a personalized shopping agent.
And unlike Rufus, Amazon's earlier AI shopping chatbot, this version is designed to do much more than answer product questions.
It wants to learn your habits, automate purchases, track prices, shop across other retailers, and eventually handle parts of shopping for you.
🤖 What Alexa for Shopping can do
The assistant is powered by Alexa+ and is now rolling out to U.S. customers across mobile, desktop, and Echo Show devices. Users can ask conversational questions like: “What's a good skincare routine for men?” or “When did I last order AA batteries?”
But the bigger shift is how proactive the system is becoming. Alexa for Shopping can:
compare products
generate shopping guides
schedule recurring purchases
monitor prices
automatically add items to your cart when conditions are met
Example:
“Add this sunscreen to my cart if it drops to $10.” That's less “AI search” and more automated purchasing infrastructure.
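Amazon hasn't published an API for these rules, but the sunscreen example boils down to a simple conditional trigger. A hedged sketch of how an agent might model it, with all names illustrative:

```python
from dataclasses import dataclass

# Hypothetical sketch: Amazon exposes no public API for these rules, so
# this just models "add to cart if it drops to $10" as a threshold check
# an agent would run against each price update it observes.

@dataclass
class PriceWatch:
    item_id: str        # illustrative identifier for the watched product
    target_price: float  # add to cart at or below this price

    def should_add_to_cart(self, current_price: float) -> bool:
        return current_price <= self.target_price

watch = PriceWatch(item_id="sunscreen-spf50", target_price=10.00)
print(watch.should_add_to_cart(12.99))  # False
print(watch.should_add_to_cart(9.49))   # True
```

The check itself is trivial; what makes the feature "purchasing infrastructure" is that the trigger is wired to a real cart and a real payment path, which Amazon already owns.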
🌐 Amazon also wants the assistant to shop outside Amazon
One of the more ambitious additions is Amazon's “Buy for Me” feature. If a product isn't available on Amazon itself, the assistant can:
search other online stores
navigate external retailers
and complete purchases on your behalf
That's a pretty major escalation in the broader AI-agent race. Because increasingly, the battleground isn't which chatbot gives the best answer, but which AI system actually executes tasks end-to-end.
This launch makes it even clearer that Amazon sees AI as the future operating layer for commerce.
Not just product recommendations or customer support, but:
purchasing
replenishment
comparison shopping
and decision-making itself
And unlike many AI companies still experimenting with consumer monetization, Amazon already owns payments, fulfillment, logistics, shopping history, subscriptions, and delivery infrastructure.
This gives it a huge advantage in turning AI agents into actual commercial behavior.
⚡ The Neural Frontier's weekly spotlight: 3 AI tools making the rounds this week.
1. 🧑‍💻 Open Vibe is a free, open-source toolkit that turns AI coding agents like Claude Code into interactive SaaS tutors, so you actually understand what you're building as you build it.
2. 🎮 ContentPilots is an AI clipping tool that turns full-length Twitch, Kick, and YouTube streams into ready-to-post vertical clips for Shorts, Reels, and TikTok.
3. 💬 Hoogly is an AI-powered employee engagement platform that replaces static surveys with real conversations, surfacing honest team sentiment and turning it into actionable leadership insights.
Wrapping up…
Last week, we had a LOT of drama. And this week? We've had a ton of product releases. That's pretty much how the pendulum swings these days.
Not that we're complaining, of course.
As always, we'll be on the frontlines, scouting for the latest and greatest in the AI space, and bringing it straight to your inbox.
We'll catch you next week on the Neural Frontier!
PS: If you're the “friend” this mail was forwarded to, and you enjoyed it, hit the Subscribe button to see more content like this every week.

