In July, when Elon Musk posted photos from inside xAI’s new data center in Memphis, Tennessee, he focused not on the expensive Nvidia server racks but on the thousands of neatly organized purple cables connecting them. Those cables are the signature product of Credo, a 17-year-old semiconductor company that is quietly emerging as a key supplier in the artificial intelligence boom.
Wall Street has taken notice. Credo’s shares have more than doubled this year, following a 245% surge in 2024 that has pushed its market capitalization from $1.4 billion at its 2022 IPO to nearly $25 billion. The company has positioned itself as a direct beneficiary of the trillion-dollar expansion in AI infrastructure.
On Friday, Credo’s stock climbed 5% after JPMorgan Chase initiated coverage with a buy rating and a $165 price target. Analysts at the bank noted that the market for active electrical cables (AECs), which Credo pioneered, is projected to reach $4 billion by 2028, driven by massive data center investments from hyperscalers like Amazon, Microsoft, Meta, and xAI. They predict Credo will see annualized revenue growth of at least 50% through 2028.
Credo’s financial performance already reflects this demand. In fiscal 2025, which ended in May, revenue more than doubled to $436.8 million, and the company swung to a net income of $52.2 million from a $28.4 million loss the previous year. According to LSEG, analysts expect sales to nearly double again in fiscal 2026 to almost $1 billion.
The company’s purple AECs, which cost between $300 and $500 each, are sturdy copper cables with chips embedded in the connectors on each end. These chips use sophisticated algorithms to maintain signal integrity over longer distances than traditional copper, with Credo’s longest cables reaching seven meters.
This technology is crucial as AI server architecture evolves. While previous servers typically had one or two processors, modern AI systems can feature eight or more GPUs on a single board, with Nvidia’s latest racks integrating up to 72 GPUs. Each GPU requires a dedicated connection to a network switch, dramatically increasing the cabling density per rack.
“In the past, Credo’s opportunity was one cable per server, but now Credo’s opportunity is nine cables per server,” said Alan Weckel, an analyst at 650 Group, who estimates Credo holds an 88% share of the AEC market.
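The arithmetic behind that density shift can be sketched in a few lines. This is purely illustrative back-of-envelope math using the figures quoted above (one switch connection per GPU, the 72-GPU rack, and the article’s $300–$500 per-cable price range); the function name and structure are hypothetical, not anything from Credo or its customers.

```python
def rack_cable_cost(gpus_per_rack: int, cables_per_gpu: int = 1,
                    price_low: float = 300.0, price_high: float = 500.0):
    """Estimate the AEC count and cost range for one rack, assuming each
    GPU needs one dedicated cable to a network switch, as described above."""
    cables = gpus_per_rack * cables_per_gpu
    return cables, cables * price_low, cables * price_high

# Nvidia's latest racks integrate up to 72 GPUs:
cables, low, high = rack_cable_cost(72)
print(cables, low, high)  # 72 cables, $21,600 to $36,000 per rack
```

At the article’s prices, a single 72-GPU rack would represent tens of thousands of dollars in cabling alone, which helps explain why a cable supplier can ride the AI build-out so directly.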
According to CEO Bill Brennan, hyperscalers are choosing Credo’s AECs over fiber optic alternatives for their superior reliability. He explained that customers want to avoid “link flaps,” where a failed optical connection can take a portion of an AI cluster offline, wasting hours of valuable GPU processing time. “It can literally shut down an entire data center,” Brennan said.
While Credo does not officially name its largest clients, analysts cite Amazon and Microsoft as customers. A recent LinkedIn post by Amazon Web Services CEO Matt Garman appeared to show the purple cables in the company’s AI hardware racks, and a Meta-designed server rack shown at a recent industry conference also prominently featured Credo’s products. The company expects three or four customers each to account for more than 10% of its revenue in the near term.
Founded in 2008 by former Marvell engineers, Credo initially focused on high-speed chip-to-chip connection technology. The AEC business only gained significant traction with the recent AI explosion. Now, with what Brennan calls “insatiable demand from the AI cluster world,” Credo is leveraging its market-leading position to expand into new product lines, including optical transceivers and software, to further capitalize on the build-out of next-generation data centers.