
AI in 15 — April 28, 2026

April 28, 2026 · 18m 00s
Kate

Microsoft and OpenAI just tore up the most consequential corporate marriage in tech. No more exclusive cloud. No more revenue-share strings tied to AGI. OpenAI is free to sell on AWS and Google Cloud starting now. And Microsoft, voluntarily, gave up the moat that built Azure into a frontier-AI powerhouse.

Kate

Welcome to AI in 15 for Tuesday, April 28, 2026. I'm Kate, your host.

Marcus

And I'm Marcus, your co-host.

Kate

Tuesday show, Marcus, and the headline is the rewrite of the Microsoft-OpenAI partnership. GitHub Copilot is moving to usage-based billing and the subsidized inference era is officially over. A four-terabyte breach at AI staffing firm Mercor just handed deepfake operators forty thousand turn-key impersonation kits. China killed Meta's two-billion-dollar acquisition of Manus after the deal had already closed. DeepSeek shipped V4 and now leads the world on Codeforces. And an open-source coding agent built by one developer just beat JetBrains and Google on the leading terminal benchmark.

Kate

The Microsoft-OpenAI marriage is officially open.

Kate

Copilot stops eating Microsoft's inference bill on June first.

Kate

And forty thousand voice prints plus matching IDs hit the dark web.

Kate

Lead story, Marcus. Yesterday Microsoft and OpenAI announced the biggest restructuring of their partnership since the original 2019 billion-dollar deal. Walk me through what actually changed.

Marcus

Three big things, Kate. First, Microsoft's exclusive right to sell OpenAI's models is dead. OpenAI can now ship its full product line on AWS and Google Cloud. Second, Microsoft's revenue-share obligation on OpenAI products it resells through Azure is gone. Third, the murky AGI declaration clauses, the language that previously governed when Microsoft's IP rights would expire, have been dramatically simplified and replaced with hard dates.

Kate

And the financials underneath it.

Marcus

Microsoft's stake is worth roughly a hundred and thirty-five billion dollars, about twenty-seven percent on a diluted basis after October's restructure. They keep an IP license through 2032, but it's now non-exclusive. OpenAI keeps paying Microsoft a revenue share through 2030, same percentage rate, but with a previously undisclosed cap and no AGI contingency. Azure remains the, quote, primary cloud partner with first-shipment rights, but Microsoft can pass on supporting new OpenAI capabilities, which frees OpenAI to ship them elsewhere.

Kate

Why now? Why did Microsoft give up exclusivity voluntarily?

Marcus

They didn't, really. They were boxed in. OpenAI committed thirty-eight billion dollars to AWS in February, and Amazon put fifty billion into OpenAI. The Stargate buildout is now seven gigawatts and four hundred billion over three years. Microsoft simply couldn't supply that capacity alone, and OpenAI was already routing around them. The restructure is Satya Nadella formalizing what was already happening on the ground.

Kate

Who's the biggest winner here?

Marcus

Google, indirectly. Every other frontier lab already runs on Google's TPU 8 chips. OpenAI couldn't, because of Azure exclusivity. Now they can. The other winner is OpenAI itself. This deal cleans up the cap table, dates the obligations, kills the AGI-trigger language that scared every IPO banker on the West Coast, and clears the runway for a real public offering. The marriage isn't over, Kate. It's just no longer monogamous, and both sides clearly prefer it that way.

Kate

Quick hits, Marcus. GitHub Copilot announced yesterday that on June first, every plan converts from flat-rate premium-request allowances to a token-metered AI Credits system.

Marcus

Pro stays at ten dollars a month, includes ten dollars of credits. Pro+ at thirty-nine, thirty-nine in credits. Business at nineteen per user, nineteen in credits. Enterprise at thirty-nine per user, thirty-nine in credits. Code completions and Next Edit suggestions stay free. Everything agentic, every chat turn against a frontier model, every Copilot code review now drains the meter at published API rates. Input tokens, output tokens, and cached tokens all count.

Kate

And the model multipliers.

Marcus

This is where it stings. GPT-5 and Claude Sonnet land at roughly five to six times the base rate. Claude Opus is set at twenty-seven times. Hacker News commenters worked out that power users who were quietly burning five hundred dollars of Opus inference on a ten-dollar plan are about to get a fifty-times effective price hike.
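For listeners following along in the show notes, here's a minimal sketch of how a token-metered credit system like the one described would burn through a plan allowance. The base rate and multiplier values are illustrative assumptions for the example, not GitHub's published pricing:

```python
# Illustrative sketch of token-metered AI-credit billing.
# BASE_RATE_PER_MTOK and the multiplier values are assumptions
# for this example, not GitHub's published numbers.

BASE_RATE_PER_MTOK = 2.00  # assumed blended dollars per million tokens at 1x

MODEL_MULTIPLIER = {
    "base": 1.0,       # e.g. a base-rate model
    "frontier": 5.5,   # the roughly 5-6x tier mentioned in the episode
    "opus-tier": 27.0, # the 27x tier mentioned in the episode
}

def credit_cost(input_tokens: int, output_tokens: int, model: str) -> float:
    """Dollars of credit consumed by one request: tokens times rate times multiplier."""
    total_mtok = (input_tokens + output_tokens) / 1_000_000
    return total_mtok * BASE_RATE_PER_MTOK * MODEL_MULTIPLIER[model]

# A heavy agentic session: 2M input tokens plus 0.5M output tokens.
base_cost = credit_cost(2_000_000, 500_000, "base")       # 5.00 under these assumptions
opus_cost = credit_cost(2_000_000, 500_000, "opus-tier")  # 27x the base figure
print(f"base tier: ${base_cost:.2f}, 27x tier: ${opus_cost:.2f}")
```

The point of the arithmetic: a session that costs five dollars of credit at the base tier costs a hundred and thirty-five at a 27x multiplier, which is why a ten-dollar monthly allowance evaporates for heavy agent users.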

Kate

So the free coding-intelligence era is over.

Marcus

It is, Kate. Microsoft was the largest distributor of frontier-model inference outside the labs themselves, and they were eating an enormous loss on power users running coding agents twenty-four-seven. This is the real economics of inference catching up with the consumer market. Expect a meaningful exodus to OpenRouter and direct API providers. Expect competitive pressure on per-token pricing because Copilot now passes the API price through transparently. The Azure cross-subsidy that made Copilot feel magical is being unwound in public.

Kate

Mercor, Marcus. The AI staffing firm that supplies most of the major labs with humans for data labeling. Lapsus$ posted four terabytes of stolen data on April fourth and the full scale only became visible yesterday.

Marcus

Hacker News surfaced an analysis with four hundred and ninety-four points. Per contractor, the dump contains a passport or driver's license scan, a webcam selfie, and a sit-down voice recording reading scripted prompts in a quiet room, two to five minutes of studio-clean speech each. Across forty thousand people. The original intrusion came through a supply-chain compromise of LiteLLM, the open-source LLM proxy library, attributed to a group calling itself TeamPCP. Five contractor lawsuits hit within ten days of disclosure.

Kate

How bad is the deepfake exposure?

Marcus

Catastrophic. High-quality voice cloning needs about fifteen seconds of clean audio. Mercor victims have ten times that, paired with the matching government ID. Bank voice biometrics, vishing CFOs, deepfake video calls, insurance fraud, elder-care impersonation, the whole catalog just got forty thousand ready-made targets and forty thousand ready-made impersonator templates. Biometric data can never be rotated. You can't reissue your voice the way you reissue a credit card.

Kate

And the supply-chain angle.

Marcus

That's the part the security community keeps flagging. Mercor sits underneath OpenAI, Anthropic, and Meta. LiteLLM sits underneath thousands of AI projects. Two weak links, one breach, irreversible biometric exposure. The German word the HN thread keeps using is Datensparsamkeit, data frugality. The AI industry has been collecting maximum data on minimum justification, and this is the bill arriving.

Kate

China, Marcus. Beijing's National Development and Reform Commission issued a one-line notice yesterday prohibiting foreign investment in Manus, the agentic-AI startup Meta acquired in late December.

Marcus

Two billion dollars, deal already substantially closed, Manus already integrated into Meta's internal systems, executives already on Meta payroll. Beijing has effectively ordered the unwinding. Manus was originally founded in China but relocated its corporate domicile to Singapore in mid-2025, shut its China offices, and ran its seventy-five-million-dollar Benchmark-led round entirely outside the PRC. Beijing says the deal violated Chinese export-control and national-security laws on outbound technology transfer.

Kate

So Singapore wasn't far enough away.

Marcus

That's the signal. China just demonstrated extraterritorial reach over a Singapore-based company on the basis of its Chinese founders and origin code. First, the quote-unquote move-your-HQ-to-Singapore playbook that ByteDance and dozens of Chinese startups have leaned on just got punctured. Founders working on China-origin AI tech now face real risk of being treated as PRC nationals regardless of incorporation. Second, this is China deploying its own version of US-style export controls, applied symmetrically to AI model code and agent IP. Expect every US acquirer to add China-origin diligence to AI deals, and expect Chinese-founded startups to pre-emptively divest founders or original IP before raising Western money. The silver lining is small. Beijing just made it harder for its own founders to get acquired by anyone, which is its own form of brain-drain insurance for the West.

Kate

DeepSeek V4, Marcus. We covered the launch over the weekend. New numbers came in.

Marcus

V4-Pro, one-point-six trillion parameters, forty-nine billion active, mixture-of-experts. V4-Flash, two hundred and eighty-four billion with thirteen billion active. Both support a one-million-token context. Codeforces rating of three thousand two hundred and six, surpassing GPT-5.4. SWE-bench Verified at eighty-point-six percent, within two-tenths of a point of Claude Opus 4.6. A perfect one-twenty out of one-twenty on Putnam-2025. Number two on Artificial Analysis's open-weights intelligence index, behind only Kimi K2.6. And in one-million-token mode it uses twenty-seven percent of the per-token FLOPs and ten percent of the KV cache of V3.2.

Kate

So a year after V3 cratered NVIDIA's stock, they're back at the open-weights frontier.

Marcus

At a fraction of the price, on MIT-style licensing, on Hugging Face, available globally. Kate, the strategic angle worth flagging is the release cadence and the distribution model. Ship free near-frontier coding models, undercut the economics of paid US alternatives, and normalize PRC-trained values and refusals into the global AI dependency stack. One Hacker News commenter put it bluntly, quote, DeepSeek V4 is good enough, really really good given the price. That's exactly the response Beijing wants from Western developers. The pricing pressure on Anthropic and OpenAI just got worse at the same moment Microsoft is unwinding the cross-subsidy that made Copilot affordable. The squeeze is coordinated, intentional, and effective.

Kate

OpenAI published a new principles document Sunday, Marcus, and the AGI language quietly disappeared.

Marcus

Sam Altman posted a five-principle framework Sunday, replacing the 2018 charter. Democratization, Empowerment, Universal Prosperity, Resilience, and Adaptability. The Democratization plank says OpenAI will, quote, resist the potential of this technology to consolidate power in the hands of the few, and that decisions should flow through democratic processes, quote, not just made by AI labs. Hacker News was not gentle. The top reply pointed out that the obvious way to serve democratization would be publishing weights and research, which OpenAI has progressively stopped doing. Another reader resurfaced the old charter language committing OpenAI to stop competing and assist a value-aligned competitor that approached AGI first. That language does not survive into the new document.

Kate

And the timing.

Marcus

One day before the Microsoft restructuring announcement. The old charter had AGI declaration triggers embedded in OpenAI's governance. The Microsoft contract simplification stripped those triggers out. The new principles document strips the language out of the public-facing identity at the same time. The framing has shifted from, quote, we are a safety-first AGI lab with binding constraints, to, quote, we are a company that wants AI to be widely available. That reads more like pre-IPO risk-factor cleanup than a governance reaffirmation. There's still no commitment to refuse military or mass-surveillance contracts, which OpenAI walked back publicly in 2024. The AGI mission language is being retired in plain sight, and the only people pointing it out are Hacker News commenters.

Kate

Last technical story, Marcus. Dirac. An open-source coding agent built by a solo developer.

Marcus

Dirac posted a sixty-five-point-two percent score on TerminalBench-2 using Gemini 3 Flash Preview. Beat Google's own official baseline at forty-seven-point-six. Beat JetBrains' closed-source Junie CLI at sixty-four-point-three. And it did it at sixty-four-point-eight percent lower cost, a two-point-eight times cost reduction. No benchmark-specific scaffolding. The trick is hash-anchored file edits with single-token Myers-diff outputs, AST-aware context selection that avoids reading whole source files, aggressive parallelism, and a tool registry where the agent decides when to call tools, not just which ones.
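For the show notes, the hash-anchored edit idea is easy to sketch. Assuming it means something like this: the agent records a hash of the exact lines it intends to replace when it reads the file, and the harness rejects the edit if the file has drifted since. All names here are hypothetical; Dirac's actual implementation isn't described beyond the summary above:

```python
import hashlib

def anchor(lines: list[str]) -> str:
    """Short hash of the exact text span an edit targets."""
    return hashlib.sha256("\n".join(lines).encode()).hexdigest()[:12]

def apply_anchored_edit(file_lines, start, end, expected_anchor, replacement):
    """Replace file_lines[start:end] only if the span still matches the hash
    the agent computed when it read the file. Raises if the file drifted."""
    span = file_lines[start:end]
    if anchor(span) != expected_anchor:
        raise ValueError("stale edit: file changed since the agent read it")
    return file_lines[:start] + replacement + file_lines[end:]

src = ["def f(x):", "    return x + 1", ""]
a = anchor(src[1:2])  # agent records the anchor when it reads the file
out = apply_anchored_edit(src, 1, 2, a, ["    return x + 2"])
```

The design point is that the edit carries its own precondition, so a concurrent change or a hallucinated line range fails loudly instead of silently corrupting the file.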

Kate

Two takeaways for listeners building anything.

Marcus

First, the harness matters more than the model. Going from forty-eight to sixty-five percent on the same Gemini 3 Flash backbone is a bigger jump than most model upgrades deliver. Second, an open-source one-person project just beat well-funded commercial agents on the most credible coding benchmark, on a non-frontier model, at a fraction of the cost. The era where the leading agentic coding tools are necessarily closed-source and venture-funded is visibly ending. Picking the right harness is now as important as picking the right model.

Kate

Last quick hit, Marcus. The labor story we've been tracking.

Marcus

Meta cutting eight thousand jobs starting May twentieth, ten percent of staff, plus six thousand more roles unfilled. Microsoft, in a first for the fifty-one-year-old company, opened voluntary buyouts to roughly seven percent of US employees, potentially another eight thousand seven hundred and fifty cuts. Total US tech layoffs in 2026 already past ninety-two thousand. The framing from executives is no longer macro slowdown. It's explicitly that Alphabet, Microsoft, Meta, and Amazon are spending close to seven hundred billion on AI capex this year and need to claw efficiency back from headcount.

Kate

So capex is being financed in part by labor cost cuts.

Marcus

That's the pattern locking in across hyperscalers. AI tools are simultaneously the cause and the budgetary justification. We need to spend on GPUs, so we save on people. For listeners, this is the most directly career-relevant story of the week. Entry-level and generalist IT hiring is visibly compressing. AI-specialized roles continue to get bid up. The middle is where the squeeze is happening.

Kate

Tuesday big picture, Marcus.

Marcus

Today's stories braid into one argument, Kate. The economics of inference and AI infrastructure are now binding. The cross-subsidies that powered the 2024-25 boom, Azure subsidizing OpenAI, Microsoft subsidizing Copilot users, hyperscalers subsidizing each other through partnership equity, are being unwound and re-priced in real time. Costs are showing up where they're actually incurred. Per token on the API meter, per gigawatt in the Stargate budget, per head on the org chart. Meanwhile China is exploiting that pricing pressure with free near-frontier open weights while locking down its own AI talent and IP from leaving the country. The race is no longer just about model quality. It's about who can sustain four-hundred-billion-dollar infrastructure builds without breaking either their balance sheet or their workforce.

Kate

That's your AI in 15 for today. See you tomorrow.