
Magic, an AI startup creating models to generate code and automate a range of software development tasks, has raised a large tranche of cash from investors, including ex-Google CEO Eric Schmidt.
In a blog post on Thursday, Magic said that it closed a $320 million fundraising round with contributions from Schmidt, as well as Alphabet's CapitalG, Atlassian, Elad Gil, Jane Street, Nat Friedman and Daniel Gross, Sequoia and others. The funding brings the company's total raised to nearly half a billion dollars ($465 million), catapulting it into a cohort of better-funded AI coding startups whose members include Codeium, Cognition, Poolside, Anysphere and Augment. (Interestingly, Schmidt is backing Augment, too.)
In July, Reuters reported that Magic was seeking to raise over $200 million at a $1.5 billion valuation. Evidently, the round came in above expectations, although the startup's current valuation couldn't be ascertained; Magic was valued at $500 million in February.
Magic also on Thursday announced a partnership with Google Cloud to build two "supercomputers" on Google Cloud Platform. The Magic-G4 will be made up of Nvidia H100 GPUs, and the Magic-G5 will use Nvidia's next-gen Blackwell chips, scheduled to come online next year. (GPUs, thanks to their ability to run many computations in parallel, are commonly used to train and serve generative AI models.)
Magic says it aims to scale the latter cluster to "tens of thousands" of GPUs over time, and that together, the clusters will be able to achieve 160 exaflops, where one exaflop is equal to one quintillion computer operations per second.
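For a sense of scale, here is a back-of-the-envelope sketch of what a 160-exaflop figure implies about cluster size. The per-GPU throughput used below is a purely illustrative assumption, not a quoted spec for the H100 or Blackwell:

```python
# Back-of-the-envelope: how many GPUs a 160-exaflop cluster implies.
# The per-GPU throughput is a hypothetical placeholder, not a vendor spec.

EXA = 10**18   # one exaflop = 1e18 operations per second
PETA = 10**15  # one petaflop = 1e15 operations per second

cluster_flops = 160 * EXA   # headline figure from the announcement
per_gpu_flops = 10 * PETA   # assumed throughput per GPU (illustrative)

gpus_needed = cluster_flops / per_gpu_flops
print(f"{gpus_needed:,.0f} GPUs")  # prints "16,000 GPUs" under these assumptions
```

Under that assumption, the math lands in the same "tens of thousands of GPUs" range Magic describes.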
"We are excited to partner with Google and Nvidia to build our next-gen AI supercomputer on Google Cloud," Magic co-founder and CEO Eric Steinberger said in a statement. "Nvidia's [Blackwell] system will greatly improve inference and training efficiency for our models, and Google Cloud offers us the fastest timeline to scale, and a rich ecosystem of cloud services."
Steinberger and Sebastian De Ro co-founded Magic in 2022. In a previous interview, Steinberger told Dakidarts that he was inspired by the potential of AI at a young age; in high school, he and his friends wired up the school's computers for machine-learning algorithm training.
That experience planted the seeds for Steinberger's computer science bachelor's program at Cambridge (he dropped out after a year) and, later, his job at Meta as an AI researcher. De Ro hailed from German business process management firm FireStart, where he worked his way up to the role of CTO. Steinberger and De Ro met at the environmental volunteer organization Steinberger co-created, ClimateScience.org.
Magic develops AI-driven tools (not yet for sale) designed to help software engineers write, review, debug and plan code changes. The tools operate like an automated pair programmer, attempting to understand and continuously learn more about the context of various coding projects.
Lots of platforms do the same, including the elephant in the room, GitHub Copilot. But one of Magic's innovations lies in its models' ultra-long context windows. It calls the models' architecture "Long-term Memory Network," or "LTM" for short.
A model's context, or context window, refers to the input data (e.g. code) that the model considers before generating output (e.g. additional code). A simple question, such as "Who won the 2020 U.S. presidential election?", can serve as context, as can a movie script, show or audio clip.
As context windows grow, so does the size of the documents (or codebases, as the case may be) that can fit into them. Long context can prevent models from "forgetting" the content of recent docs and data, and from veering off topic and extrapolating wrongly.
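The "forgetting" effect is easy to illustrate. The sketch below uses a naive whitespace "tokenizer" (real models use subword tokenizers, so this is only a toy) to show how anything outside the window simply drops out of what the model can see:

```python
# Toy illustration of a context window. Real LLMs use subword tokenizers;
# whitespace splitting here is a deliberate simplification.

def truncate_to_context(text: str, max_tokens: int) -> str:
    tokens = text.split()        # stand-in for real tokenization
    kept = tokens[-max_tokens:]  # keep only the most recent tokens
    return " ".join(kept)

history = "def add(a, b): return a + b  # earlier code the model saw"
print(truncate_to_context(history, max_tokens=4))
# prints "code the model saw" -- the earlier code has fallen out of the window
```

A 100 million-token window pushes that truncation point far enough back that, for most codebases, nothing has to fall out at all.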
Magic claims its latest model, LTM-2-mini, has a 100 million-token context window. (Tokens are subdivided bits of raw data, like the syllables "fan," "tas" and "tic" in the word "fantastic.") One hundred million tokens is equivalent to around 10 million lines of code, or 750 novels. And it's by far the largest context window of any commercial model; the next-largest are Google's Gemini flagship models at 2 million tokens.
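Those equivalences are easy to sanity-check. Taking the article's figures at face value, the implied ratios come out to roughly 10 tokens per line of code and about 133,000 tokens per novel, both plausible rules of thumb:

```python
# Sanity-check the stated equivalences for a 100 million-token window,
# using the article's own figures as inputs.

context_tokens = 100_000_000
lines_of_code = 10_000_000  # article's estimate
novels = 750                # article's estimate

tokens_per_line = context_tokens / lines_of_code  # 10.0 tokens per line
tokens_per_novel = context_tokens / novels        # ~133,333 tokens per novel

print(tokens_per_line, round(tokens_per_novel))   # prints "10.0 133333"
```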
Magic says that thanks to its long context, LTM-2-mini was able to implement a password strength meter for an open source project and create a calculator using a custom UI framework pretty much autonomously.
The company's now in the process of training a larger version of that model.
Magic has a small team (around two dozen people) and no revenue to speak of. But it's going after a market that could be worth $27.17 billion by 2032, according to an estimate by Polaris Research, and investors perceive that to be a worthwhile (and possibly quite lucrative) endeavor.
Despite the security, copyright and reliability concerns around AI-powered assistive coding tools, developers have shown enthusiasm for them, with the vast majority of respondents in GitHub's latest poll saying that they've adopted AI tools in some form. Microsoft reported in April that Copilot had over 1.8 million paying users and more than 50,000 business customers.
And Magic's ambitions are grander than automating routine software development tasks. On its website, the company speaks of a path to AGI: AI that can solve problems more reliably than humans can alone.
Toward such AI, San Francisco-based Magic recently hired Ben Chess, a former lead on OpenAI's supercomputing team, and plans to expand its cybersecurity, engineering, research and system engineering teams.

