News
A BofA Securities analyst predicts AI will drive server market growth over the next six years, benefiting Dell and HP. AI server ...
Dell's AI server demand drives a $49B backlog, signaling robust growth and a 16% upside potential. Explore why DELL stock ...
Dell computers are often a bit more expensive upfront, but they tend to hold their value well over time. HP usually has more ...
Zacks Investment Research on MSN: Should You Buy, Sell or Hold Dell Technologies Stock at a P/S of 0.77X? Dell Technologies DELL shares are cheap, as suggested by a Value Score of A. DELL stock is trading at a significant discount with a forward 12-month P/S of 0.77X compared with the Computer and ...
The new Dell PowerEdge XE8712 server features the GB200 NVL4 platform and supports up to 144 NVIDIA B200 GPUs per Dell IR7000 rack. These liquid-cooled systems are tailored for AI model training ...
Dell's liquid-cooled PowerEdge XE9780L and XE9785L systems can currently support up to 256 NVIDIA Blackwell Ultra GPUs per rack, cooling these chips to ensure ...
Dell also provides a range of server solutions, including Dell PowerEdge Rack Servers, Tower Servers, Modular Infrastructure and PowerEdge C-Series.
The US Department of Energy (DOE) has selected Dell to supply the next supercomputer for Lawrence Berkeley National Laboratory in Berkeley, California. Featuring Nvidia chips, the supercomputer will ...
Dell also presented another server, the PowerEdge XE9712. It is built around the Nvidia GB300 NVL72, which contains 72 Blackwell Ultra GPUs and 36 ARM processors from Nvidia's Grace series.
Dell Technologies fuels enterprise AI innovation with infrastructure, solutions, services: Dell PowerEdge XE9785 and XE9785L servers will support AMD MI350x accelerators with 288GB of HBM3e memory per GPU and up to 35 times greater inferencing performance.