MCP for All
AI for eCommerce Newsletter - 65
Most people hear MCP (Model Context Protocol) and their eyes do the slow fade. It sounds like something reserved for people who sleep with YAML under their pillow. But the truth is MCP is quietly becoming the new plumbing of AI workflows. If you want your models to do anything useful with your real data, you're going to bump into it sooner than you think.
MCP and e-Commerce
Amazon just rolled out their own Ads MCP server in beta, which tells you exactly where this is heading. Platforms are racing to give AI direct access to the real world. Amazon's version slots into that trend. It's their bridge between AI agents and the massive set of tools AWS already runs.
Early signs point to a model where a Claude or a GPT can call real AWS functions, hit real APIs, and actually perform small actions on your behalf inside your cloud stack. It's still early, but the direction is clear. No more toy demos. Real data. Real actions. Real consequences.
Claude and BigQuery MCP Server
I recently wired up Claude desktop with Google BigQuery through the MCP server called MCP Toolbox.
Picture a tiny translator sitting between Claude and your database (in my case, BigQuery). You ask the model something. MCP handles the messy bits. Suddenly Claude is reading columns, shaping queries, and helping you reason about your data without you manually spelunking through SQL.
How to set this up?
I have to admit, it wasn't an easy setup. There were folders, keys, configs, and a few moments where I wondered if I should just go back to spreadsheets. But once it clicked, the whole thing felt obvious. AI agents are not going to thrive in isolation. They need hooks into your systems. MCP is that hook. And you can ask AI to walk you through the setup step by step.
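For reference, the wiring lives in Claude Desktop's config file. Below is a rough sketch of what mine looked like. Treat every value as a placeholder: the binary path and project ID are yours, and the exact flags and environment variables can differ by MCP Toolbox version, so check the Toolbox docs rather than copying this verbatim.

```json
{
  "mcpServers": {
    "bigquery": {
      "command": "/path/to/toolbox",
      "args": ["--prebuilt", "bigquery", "--stdio"],
      "env": {
        "BIGQUERY_PROJECT": "your-gcp-project-id"
      }
    }
  }
}
```

Once Claude Desktop restarts with this in place, the BigQuery tools show up in the tool picker and the model can start querying on your behalf.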

To what end?
The fun part is what happens after the MCP server is actually connected. You stop "running SQL queries" and start chatting with your data. Simple things, like asking for the average across all clients at my agency. Or harder stuff, like asking why my TACoS went up last month and letting the model dig through multiple tables and assemble the answer on its own.


A year ago this would have required real engineering. Custom scripts, scheduled jobs, layers of glue code. Now you point your model at an MCP server and it starts behaving like an internal analyst who already knows where everything lives.
That is the shift. You move from writing SQL to asking better questions. And the AI handles the messy cross table detective work that used to eat half a day.
What next?
If you've been watching this space from the shoreline, this is the moment to wade in. MCP is not hype. It's the protocol layer that lets your models work inside your business instead of floating above it.
You don't need to be an engineer to get the big picture. You just need to understand that the future of AI is not bigger models. It's connective tissue. And MCP is quietly becoming the connective tissue of everything.

PixVerse: Pic to Video in no time
Video is still the heavyweight in e-commerce, but most brands get stuck at the same choke point. Shoots are slow. Edits are slower. Budgets vanish fast. PixVerse cuts straight through that friction. It turns a single product photo or a short text prompt into multiple motion clips you can actually test. Think Image to Video, Text to Video, and those viral effects that social feeds love. One shot becomes five variations in minutes.
Why it matters
Video volume decides who wins attention. PixVerse gives you volume without the cost spiral. It is perfect for quick A/B tests, scroll stoppers, UGC flavored content, and proving a hook before committing to a real production. When a client hesitates on video budgets, this is the on ramp. You show results first, then scale the spend.
Use it with intention
The output is fast and usable, even if not studio grade. You will see the occasional artifact or weird transition. And because it runs on a credit model, careless testing can get expensive.

Also remember that relying on trending effects can flatten your brand into everyone else's feed, so push the tool instead of settling for templates.
PixVerse works best as a rapid prototype engine. Use it to explore angles, validate ideas, and build early traction. Then hand the winners to your creative team for polish. That keeps your brand sharp and your testing loop fast.
I like Google VEO 3 better, but it also comes at a much higher price point!
The future of e-commerce requires scale and speed. PPC Ninja helps brands dominate the AI transition. We leverage AI to build stunning, high-converting images and video, efficiently scaling your content production across all channels (Amazon Ads, Social Media, Posts). Stop fighting for relevance. Reach out to [email protected] to explore how we can immediately upgrade your content and future-proof your listings.

FFMPEG Use cases
I recorded the entire Amazon unBoxed 2025 keynote address (1.5 hours) on my iPhone audio recorder. Later, I asked ChatGPT for an ffmpeg command that would shrink the 50MB+ audio file to the smallest file size possible. A few seconds later I had a tiny 500KB version saved. Simple, direct, efficient. That is ffmpeg in a nutshell.
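If you want to sanity-check a target size before running the compression yourself, the arithmetic is simple: compressed audio size in kilobytes is roughly duration in seconds times bitrate in kbps divided by 8. A quick sketch in plain shell (the numbers are illustrative, not from my recording):

```shell
# Estimate compressed audio size before running ffmpeg:
#   size_KB ~= duration_s * bitrate_kbps / 8
duration_s=600       # a 10-minute recording
bitrate_kbps=64      # the value you would pass to -b:a
size_kb=$((duration_s * bitrate_kbps / 8))
echo "~${size_kb} KB"
```

Run the numbers first and you know whether the bitrate you picked will actually hit the file size you need.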

I then uploaded that tiny file into Google Gemini and had it transcribed and turned into a LinkedIn post with my key conference updates. This is what I call efficiency in the AI era.
So, what is FFmpeg?
A command line tool that reshapes audio and video with surgical precision. Trim it, compress it, convert it, extract it. If your media workflow feels clunky, FFmpeg often has a cleaner way through it. Most web based audio and video editors rely on it behind the scenes. You just never see it working.
Five ideas for using FFmpeg
1) Audio compression
Takes a heavyweight recording and turns it into something light enough for quick sharing without trashing the quality.
ffmpeg -i input.m4a -b:a 64k output.m4a

2) Format switching
When an editor complains about a file, FFmpeg just converts it and moves on.
ffmpeg -i input.mov output.mp4

3) GIF creation
Cuts a small moment from video and loops it into a tight, shareable snippet.
ffmpeg -i input.mp4 -ss 00:00:02 -t 3 -vf "fps=12,scale=480:-1" output.gif

4) Video to audio extraction
Saves only the voice or soundtrack so you carry the message without the full video weight.
ffmpeg -i input.mp4 output.mp3

5) Silence trimming
Removes the dead space at the start and end of audio so it feels intentional from the first second.
ffmpeg -i input.wav -af silenceremove=start_periods=1:start_threshold=-50dB:stop_periods=1:stop_threshold=-50dB output.wav
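Once any of these one-liners works for a single file, the natural next step is batching. A minimal sketch of a dry-run loop over a few files (the filenames are placeholders; it only prints the commands, so delete the "echo" to actually run them, and it assumes ffmpeg is on your PATH):

```shell
# Dry-run batch compression: print the ffmpeg command for each file.
# Remove "echo" to execute for real (requires ffmpeg on your PATH).
for f in talk.wav interview.wav; do
  out="${f%.wav}.m4a"               # swap the extension for the output name
  echo ffmpeg -i "$f" -b:a 64k "$out"
done
```

The dry run lets you eyeball every command before an unattended loop chews through a whole folder of recordings.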
RUFUS got some prominent real estate
Perhaps you noticed it already but RUFUS just grabbed more real estate on Amazon and it is acting a lot more like a true agent than a simple search feature. If you have used tools like Atlas or Comet, you know the pattern. The assistant sits on the side, thinking while you browse. Amazon just adopted that same layout.

Now RUFUS lives in the sidebar and you can watch it work through real steps. It checks your past searches, looks at your cart, scans your lists, and pulls products together. This is not filtering. This is an assistant taking actions inside the shopping flow.
What makes this shift interesting is that none of the outside players can match it. Not ChatGPT Shopping. Not Perplexity. Not Copilot. Not Google. They are all guessing from the outside. RUFUS sits inside the actual session with access to cart signals, account behavior, and the full catalog.
Amazon released their AI Creative Studio at unBoxed 2025
Amazon launched the Gen AI Creative Studio last year, and now they are introducing their creative agent. It is still early days, but you can see where they want this to go.

In the unBoxed keynote, they walked through how a brand could build an STV ad using only AI. The output showed what you'd expect from a first-generation system. It had the core pieces in place and you could see the direction, even if the polish is yet to come.

Amazon framed these as "better than nothing", which I took less as a verdict and more as a signal of intent. This is a starting point. A way to move faster on drafts, concept tests, and early explorations before investing in fully produced assets.
And all of this is happening while audiences are paying closer attention to how AI shows up in creative work. The recent criticism around Coca-Cola's holiday campaign is a good example. People care, and brands are learning how to navigate that.
The upside is still real. Faster cycles, cheaper experimentation, more creative volume. The trick is using AI where it helps and keeping humans where the stakes are high. These tools will improve quickly. The brands that stay thoughtful through the transition will benefit the most.
We hope you liked this edition of the AI for E-Commerce Newsletter! Hit reply and let us know what you think! Thank you for being a subscriber! Know anyone who might be interested to receive this newsletter? Share it with them and they will thank you for it!

- Ritu
Find your customers on Roku this Black Friday
As with any digital ad campaign, the important thing is to reach streaming audiences who will convert. To that end, Roku's self-service Ads Manager stands ready with powerful segmentation and targeting options. After all, you know your customers, and we know our streaming audience.
Worried it's too late to spin up new Black Friday creative? With Roku Ads Manager, you can easily import and augment existing creative assets from your social channels. We also have AI-assisted upscaling, so every ad is primed for CTV.
Once you've done this, you can easily set up A/B tests to flight different creative variants and Black Friday offers. If you're a Shopify brand, you can even run shoppable ads directly on-screen so viewers can purchase with just a click of their Roku remote.
Bonus: we're gifting you $5K in ad credits when you spend your first $5K on Roku Ads Manager. Just sign up and use code GET5K. Terms apply.



