Intel Launches 4th Gen Xeon Scalable Processors, Max Series CPUs

Intel highlights broad industry adoption across all major CSPs, OEMs, ODMs and ISVs, and showcases increased performance in AI, networking and high performance computing.

NEWS HIGHLIGHTS
 

  • Expansive customer and partner adoption from AWS, Cisco, Cloudera, CoreWeave, Dell Technologies, Dropbox, Ericsson, Fujitsu, Google Cloud, Hewlett Packard Enterprise, IBM Cloud, Inspur Information, IONOS, Lenovo, Los Alamos National Laboratory, Microsoft Azure, NVIDIA, Oracle Cloud, OVHcloud, phoenixNAP, Red Hat, SAP, Supermicro, Telefonica and VMware, among others.
  • With the most built-in accelerators of any CPU in the world for key workloads such as AI, analytics, networking, security, storage and high performance computing (HPC), the 4th Gen Intel Xeon Scalable and Intel Max Series families deliver leadership performance with a purpose-built, workload-first approach.
  • 4th Gen Intel Xeon Scalable processors are Intel's most sustainable data center processors, delivering a range of features for optimizing power and performance and making optimal use of CPU resources to help achieve customers' sustainability goals.
  • Compared with prior generations, 4th Gen Xeon customers can expect a 2.9x¹ average performance-per-watt efficiency improvement for targeted workloads when utilizing built-in accelerators, up to 70 watts² of power savings per CPU in optimized power mode with minimal performance loss for select workloads, and a 52% to 66% lower total cost of ownership (TCO)³.

SANTA CLARA, Calif., Jan. 10, 2023 – Intel today marked one of the most important product launches in company history with the unveiling of 4th Gen Intel® Xeon® Scalable processors (code-named Sapphire Rapids), the Intel® Xeon® CPU Max Series (code-named Sapphire Rapids HBM) and the Intel® Data Center GPU Max Series (code-named Ponte Vecchio), delivering for its customers a leap in data center performance, efficiency, security and new capabilities for AI, the cloud, the network and edge, and the world's most powerful supercomputers.

Working alongside its customers and partners with 4th Gen Xeon, Intel is delivering differentiated solutions and systems at scale to tackle their biggest computing challenges. Intel's unique approach to providing purpose-built, workload-first acceleration and highly optimized software tuned for specific workloads enables the company to deliver the right performance at the right power for optimal overall total cost of ownership.

Press Kit: 4th Gen Xeon Scalable Processors

Additionally, as Intel's most sustainable data center processors, 4th Gen Xeon processors deliver customers a range of features for managing power and performance, making optimal use of CPU resources to help achieve their sustainability goals.

“The launch of 4th Gen Xeon Scalable processors and the Max Series product family is a pivotal moment in fueling Intel's turnaround, reigniting our path to leadership in the data center and growing our footprint in new arenas,” said Sandra Rivera, Intel executive vice president and general manager of the Data Center and AI Group. “Intel's 4th Gen Xeon and the Max Series product family deliver what customers truly want – leadership performance and reliability within a secure environment for their real-world requirements – driving faster time to value and powering their pace of innovation.”

Unlike any other data center processor on the market and already in the hands of customers today, the 4th Gen Xeon family greatly expands on Intel's purpose-built, workload-first strategy and approach.

Leading Performance and Sustainability Benefits with the Most Built-In Acceleration

Today, there are over 100 million Xeons installed in the market – from on-prem servers running IT services, including new as-a-service business models, to networking equipment managing internet traffic, to wireless base station computing at the edge, to cloud services.

Building on decades of data center, network and intelligent edge innovation and leadership, new 4th Gen Xeon processors deliver leading performance with the most built-in accelerators of any CPU in the world to tackle customers' most important computing challenges across AI, analytics, networking, security, storage and HPC.

 

Compared with prior generations, 4th Gen Intel Xeon customers can expect a 2.9x¹ average performance-per-watt efficiency improvement for targeted workloads when utilizing built-in accelerators, up to 70 watts² of power savings per CPU in optimized power mode with minimal performance loss, and a 52% to 66% lower TCO³.

Sustainability

The expansive set of built-in accelerators in 4th Gen Xeon means Intel delivers platform-level power savings, lessening the need for additional discrete acceleration and helping customers achieve their sustainability goals. Additionally, the new Optimized Power Mode can deliver up to 20% socket power savings with a less than 5% performance impact for selected workloads¹¹. New innovations in air and liquid cooling further reduce total data center energy consumption; and 4th Gen Xeon is manufactured with 90% or more renewable electricity at Intel sites with state-of-the-art water reclamation facilities.

Artificial Intelligence

In AI, compared to the previous generation, 4th Gen Xeon processors achieve up to 10x⁵,⁶ higher PyTorch real-time inference and training performance with built-in Intel® Advanced Matrix Extensions (Intel® AMX) accelerators. Intel's 4th Gen Xeon unlocks new levels of performance for inference and training across a wide breadth of AI workloads. The Xeon CPU Max Series expands on these capabilities for natural language processing, with customers seeing up to a 20x¹² speed-up on large language models. With the delivery of Intel's AI software suite, developers can use their AI tool of choice while increasing productivity and speeding time to AI development. The suite is portable from the workstation, enabling it to scale out in the cloud and all the way out to the edge. And it has been validated with over 400 machine learning and deep learning AI models across the most common AI use cases in every business segment.

Networking

4th Gen Xeon offers a family of processors specifically optimized for high-performance, low-latency network and edge workloads. These processors are a critical part of the foundation driving a more software-defined future for industries ranging from telecommunications and retail to manufacturing and smart cities. For 5G core workloads, built-in accelerators help increase throughput and decrease latency, while advances in power management enhance both the responsiveness and the efficiency of the platform. And, compared to previous generations, 4th Gen Xeon delivers up to twice the virtualized radio access network (vRAN) capacity⁷ without increasing power consumption. This enables communications service providers to double performance-per-watt to meet their critical performance, scaling and energy-efficiency needs.

High Performance Computing

4th Gen Xeon and the Intel Max Series product family bring a scalable, balanced architecture that integrates CPU and GPU with oneAPI's open software ecosystem for demanding computing workloads in HPC and AI, solving the world's most challenging problems.

The Xeon CPU Max Series is the first and only x86-based processor with high-bandwidth memory, accelerating many HPC workloads without the need for code changes. The Intel Data Center GPU Max Series is Intel's highest-density processor and will be available in several form factors that address different customer needs.

The Xeon CPU Max Series offers 64 gigabytes of high-bandwidth memory (HBM2e) on the package, significantly increasing data throughput for HPC and AI workloads. Compared with top-end 3rd Gen Intel® Xeon® Scalable processors, the Xeon CPU Max Series provides up to 3.7x¹⁰ more performance on a range of real-world applications like energy and earth systems modeling.

Further, the Data Center GPU Max Series packs over 100 billion transistors into a 47-tile package, bringing new levels of throughput to challenging workloads like physics, financial services and life sciences. When paired with the Xeon CPU Max Series, the combined platform achieves up to 12.8x¹³ greater performance than the prior generation when running the LAMMPS molecular dynamics simulator.

Most Feature-Rich and Secure Xeon Platform Yet

Signifying the biggest platform transformation Intel has delivered, 4th Gen Xeon is not only a marvel of acceleration but also an achievement in manufacturing: it combines up to four Intel 7-built tiles on a single package, connected using Intel EMIB (embedded multi-die interconnect bridge) packaging technology, and delivers new features including increased memory bandwidth with DDR5, increased I/O bandwidth with PCIe 5.0, and the Compute Express Link (CXL) 1.1 interconnect.

At the foundation of it all is security. With 4th Gen Xeon, Intel is delivering the most comprehensive confidential computing portfolio of any data center silicon provider in the industry, enhancing data security, regulatory compliance and data sovereignty. Intel remains the only silicon provider to offer application isolation for data center computing with Intel® Software Guard Extensions (Intel® SGX), which provides today's smallest attack surface for confidential computing in private, public and cloud-to-edge environments. Additionally, Intel's new virtual-machine (VM) isolation technology, Intel® Trust Domain Extensions (Intel® TDX), is ideal for porting existing applications into a confidential environment and will debut with Microsoft Azure, Alibaba Cloud, Google Cloud and IBM Cloud.

Finally, the modular architecture of 4th Gen Xeon allows Intel to offer a wide range of processors across nearly 50 targeted SKUs for customer use cases and applications, from mainstream general-purpose SKUs to purpose-built SKUs for cloud, database and analytics, networking, storage, and single-socket edge use cases. The 4th Gen Xeon processor family is On Demand-capable and varies in core count, frequency, mix of accelerators, power envelope and memory throughput as appropriate for target use cases and form factors, addressing customers' real-world requirements.

SKU TABLE: SKUs for 4th Gen Xeon and Intel Xeon CPU Max Series

 

¹ Geomean of the following workloads: RocksDB (IAA vs ZSTD), ClickHouse (IAA vs ZSTD), SPDK large media and database request proxies (DSA vs out of box), Image Classification ResNet-50 (AMX vs VNNI), Object Detection SSD-ResNet-34 (AMX vs VNNI), QATzip (QAT vs zlib)
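The averaging method matters here: a single figure derived from a set of per-workload speedup ratios like these is a geometric mean, not an arithmetic one. A minimal sketch of the calculation, using made-up ratios for illustration (the values below are not Intel's measured data):

```python
from math import prod

def geomean(ratios):
    """Geometric mean: the nth root of the product of n speedup ratios.
    Preferred over the arithmetic mean for ratios, since one workload
    with an outsized gain cannot dominate the average."""
    return prod(ratios) ** (1.0 / len(ratios))

# Hypothetical per-workload perf/watt gains (illustrative only):
ratios = [3.0, 2.5, 3.5, 2.8, 3.1, 2.6]
print(round(geomean(ratios), 2))  # → 2.9
```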

² 1-node, Intel Reference Validation Platform, 2x Intel® Xeon® 8480+ (56C, 2GHz, 350W TDP), HT On, Turbo On, Total Memory: 1 TB (16 slots/64GB/4800 MHz), 1x P4510 3.84TB NVMe PCIe Gen4 drive, BIOS: 0091.D05 (ucode: 0x2b0000c0), CentOS Stream 8, 5.15.0-spr.bkc.pc.10.4.11.x86_64, Java Perf/Watt w/ openjdk-11+28_linux-x64_bin, 112 instances, 1550MB initial/max heap size. Tested by Intel as of October 2022.

³ ResNet50 Image Classification

New configuration: 1-node, 2x pre-production 4th Gen Intel® Xeon® Scalable 8490H processor (60 cores) with Intel® Advanced Matrix Extensions (Intel AMX), on pre-production Supermicro SYS-221H-TNR with 1024GB DDR5 memory (16x64GB), microcode 0x2b0000c0, HT On, Turbo On, SNC Off, CentOS Stream 8, 5.19.16-301.fc37.x86_64, 1x 3.84TB P5510 NVMe, 10GbE x540-AT2, Intel TF 2.10, AI model: ResNet-50 v1_5, best scores achieved: BS1 AMX 1 core/instance (max. 15ms SLA), using physical cores, tested by Intel November 2022. Baseline: 1-node, 2x production 3rd Gen Intel Xeon Scalable 8380 processor (40 cores) on Supermicro SYS-220U-TNR, DDR4 memory total 1024GB (16x64GB), microcode 0xd000375, HT On, Turbo On, SNC Off, CentOS Stream 8, 5.19.16-301.fc37.x86_64, 1x 3.84TB P5510 NVMe, 10GbE x540-AT2, Intel TF 2.10, AI model: ResNet-50 v1_5, best scores achieved: BS1 INT8 2 cores/instance (max. 15ms SLA), using physical cores, tested by Intel November 2022.

For a 50-server fleet of 3rd Gen Xeon 8380 (RN50 w/DLBoost), estimated as of November 2022:

CapEx costs: $1.64M

OpEx costs (4-year; includes power and cooling utility costs, infrastructure and hardware maintenance costs): $739.9K

Energy use in kWh (4-year, per server): 44627; PUE 1.6

Other assumptions: utility cost $0.1/kWh, kWh-to-kg-CO2 factor 0.42394

For a 17-server fleet of 4th Gen Xeon 8490H (RN50 w/AMX), estimated as of November 2022:

CapEx costs: $799.4K

OpEx costs (4-year; includes power and cooling utility costs, infrastructure and hardware maintenance costs): $275.3K

Energy use in kWh (4-year, per server): 58581; PUE 1.6
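The per-server energy figures and the stated assumptions ($0.1/kWh utility cost, 0.42394 kg CO2 per kWh) scale directly to fleet totals; a quick sketch of that arithmetic:

```python
# 4-year per-server energy use from footnote 3, scaled to fleet size.
kwh_per_server_old, servers_old = 44627, 50   # 3rd Gen Xeon 8380 fleet
kwh_per_server_new, servers_new = 58581, 17   # 4th Gen Xeon 8490H fleet

fleet_kwh_old = kwh_per_server_old * servers_old  # 2,231,350 kWh
fleet_kwh_new = kwh_per_server_new * servers_new  # 995,877 kWh

UTILITY = 0.10        # $/kWh, per the stated assumptions
CO2_FACTOR = 0.42394  # kg CO2 per kWh

print(f"old fleet: ${fleet_kwh_old * UTILITY:,.0f} utility, "
      f"{fleet_kwh_old * CO2_FACTOR:,.0f} kg CO2")
print(f"new fleet: ${fleet_kwh_new * UTILITY:,.0f} utility, "
      f"{fleet_kwh_new * CO2_FACTOR:,.0f} kg CO2")
```

Note that the smaller 4th Gen fleet uses less total energy even though each server consumes more.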

AI — 55% lower TCO by deploying fewer 4th Gen Intel® Xeon® processor-based servers to meet the same performance requirement. See [E7] at intel.com/processorclaims: 4th Gen Intel Xeon Scalable processors. Results may vary.
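As a sanity check, the 55% AI figure follows directly from the fleet CapEx and OpEx estimates in footnote 3; a minimal reconstruction:

```python
# 4-year fleet costs from footnote 3, in thousands of USD.
old_capex, old_opex = 1640.0, 739.9   # 50x 3rd Gen Xeon 8380 servers
new_capex, new_opex = 799.4, 275.3    # 17x 4th Gen Xeon 8490H servers

old_tco = old_capex + old_opex        # 2379.9
new_tco = new_capex + new_opex        # 1074.7

savings = 1 - new_tco / old_tco
print(f"{savings:.0%}")  # → 55%
```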

Database — 52% lower TCO by deploying fewer 4th Gen Intel® Xeon® processor-based servers to meet the same performance requirement. See [E8] at intel.com/processorclaims: 4th Gen Intel Xeon Scalable processors. Results may vary.

HPC — 66% lower TCO by deploying fewer Intel® Xeon® CPU Max processor-based servers to meet the same performance requirement. See [E9] at intel.com/processorclaims: 4th Gen Intel Xeon Scalable processors. Results may vary.

⁴ Geomean of HP Linpack, Stream Triad, SPECrate2017_fp_base est., SPECrate2017_int_base est. See [G2, G4, G6] at intel.com/processorclaims: 4th Gen Intel Xeon Scalable.

⁵ Up to 10x higher PyTorch real-time inference performance with built-in Intel® Advanced Matrix Extensions (Intel® AMX) (BF16) vs. the prior generation (FP32).

PyTorch geomean of ResNet50, BERT-Large, MaskRCNN, SSD-ResNet34, RNN-T, ResNeXt101.

⁶ Up to 10x higher PyTorch training performance with built-in Intel® Advanced Matrix Extensions (Intel® AMX) (BF16) vs. the prior generation (FP32).

PyTorch geomean of ResNet50, BERT-Large, DLRM, MaskRCNN, SSD-ResNet34, RNN-T.

⁷ Estimated as of 8/30/2022 based on 4th Gen Intel® Xeon® Scalable processor architecture improvements vs. 3rd Gen Intel® Xeon® Scalable processor at similar core count, socket power and frequency in a test scenario using FlexRAN™ software. Results may vary.

⁸ Up to 95% fewer cores and 2x higher Level 1 compression throughput with 4th Gen Intel Xeon Platinum 8490H using integrated Intel QAT vs. the prior generation.

8490H: 1-node, pre-production platform with 2x 4th Gen Intel® Xeon® Scalable processor (60 cores) with integrated Intel QuickAssist Accelerator (Intel QAT), QAT devices utilized = 8 (2 sockets active), total 1024GB (16x64GB) DDR5 memory, microcode 0xf000380, HT On, Turbo Off, SNC Off, Ubuntu 22.04.1 LTS, 5.15.0-47-generic, 1x 1.92TB Intel® SSDSC2KG01, QAT v20.l.0.9.1, QATzip v1.0.9, ISA-L v2.3.0, tested by Intel September 2022.

8380: 1-node, 2x 3rd Gen Intel Xeon Scalable processors (40 cores) on Coyote Pass platform, DDR4 memory total 1024GB (16x64GB), microcode 0xd000375, HT On, Turbo Off, SNC Off, Ubuntu 22.04.1 LTS, 5.15.0-47-generic, 1x 1.92TB Intel SSDSC2KG01, QAT v1.7.l.4.16, QATzip v1.0.9, ISA-L v2.3.0, tested by Intel October 2022.

⁹ Up to 3x higher RocksDB performance with 4th Gen Intel Xeon Platinum 8490H using integrated Intel IAA vs. the prior generation.

8490H: 1-node, pre-production Intel platform with 2x 4th Gen Intel Xeon Scalable processor (60 cores) with integrated Intel In-Memory Analytics Accelerator (Intel IAA), HT On, Turbo On, total memory 1024GB (16x64GB DDR5 4800), microcode 0xf000380, 1x 1.92TB Intel SSDSC2KG01, Ubuntu 22.04.1 LTS, 5.18.12-051812-generic, QPL v0.1.21, accel-config v3.4.6.4, ZSTD v1.5.2, RocksDB v6.4.6 (db_bench), tested by Intel September 2022.

8380: 1-node, 2x 3rd Gen Intel Xeon Scalable processors (40 cores) on Coyote Pass platform, HT On, Turbo On, SNC Off, total memory 1024GB (16x64GB DDR4 3200), microcode 0xd000375, 1x 1.92TB Intel SSDSC2KG01, Ubuntu 22.04.1 LTS, 5.18.12-051812-generic, ZSTD v1.5.2, RocksDB v6.4.6 (db_bench), tested by Intel October 2022.

¹⁰ Intel® Xeon® 8380: Tested by Intel as of 10/7/2022. 1-node, 2x Intel® Xeon® 8380 CPU, HT On, Turbo On, total memory 256GB (16x16GB 3200MT/s DDR4), BIOS version SE5C620.86B.01.01.0006.2207150335, ucode revision 0xd000375, Rocky Linux 8.6, Linux version 4.18.0-372.26.1.el8_6.crt1.x86_64, YASK v3.05.07.

Intel® Xeon® CPU Max Series: Tested by Intel as of ww36'22. 1-node, 2x Intel® Xeon® CPU Max Series, HT On, Turbo On, SNC4, total memory 128GB (8x16GB HBM2 3200MT/s), BIOS version SE5C7411.86B.8424.D03.2208100444, ucode revision 0x2c000020, CentOS Stream 8, Linux version 5.19.0-rc6.0712.intel_next.1.x86_64+server, YASK v3.05.07.

¹¹ Up to 20% system power savings utilizing 4th Gen Xeon Scalable with Optimized Power Mode on vs. off on select workloads including SPECjbb, SPECint and NGINX key handshake.

¹² AMD Milan: Tested by Numenta as of 11/28/2022. 1-node, 2x AMD EPYC 7R13 on AWS m6a.48xlarge, 768GB DDR4-3200, Ubuntu 20.04, kernel 5.15, OpenVINO 2022.3, BERT-Large, sequence length 512, batch size 1.

Intel® Xeon® 8480+: Tested by Numenta as of 11/28/2022. 1-node, 2x Intel® Xeon® 8480+, 512GB DDR5-4800, Ubuntu 22.04, kernel 5.17, OpenVINO 2022.3, Numenta-optimized BERT-Large, sequence length 512, batch size 1.

Intel® Xeon® Max 9468: Tested by Numenta as of 11/30/2022. 1-node, 2x Intel® Xeon® Max 9468, 128GB HBM2e 3200MT/s, Ubuntu 22.04, kernel 5.15, OpenVINO 2022.3, Numenta-optimized BERT-Large, sequence length 512, batch size 1.

¹³ Intel® Xeon® 8380: Tested by Intel as of 10/28/2022. 1-node, 2x Intel® Xeon® 8380 CPU, HT On, Turbo On, total memory 256GB (16x16GB 3200MT/s, dual-rank), BIOS version SE5C6200.86B.0020.P23.2103261309, ucode revision 0xd000270, Rocky Linux 8.6, Linux version 4.18.0-372.19.1.el8_6.crt1.x86_64.

Intel® Xeon® CPU Max Series HBM: Tested by Intel as of 10/28/2022. 1-node, 2x Intel® Xeon® Max 9480, HT On, Turbo On, total memory 128GB HBM2e, BIOS EGSDCRB1.DWR.0085.D12.2207281916, ucode 0xac000040, SUSE Linux Enterprise Server 15 SP3, kernel 5.3.18, oneAPI 2022.3.0.

Intel® Data Center GPU Max Series with DDR host: Tested by Intel as of 10/28/2022. 1-node, 2x Intel® Xeon® Max 9480, HT On, Turbo On, total memory 1024GB DDR5-4800 + 128GB HBM2e, memory mode: Flat, HBM2e not used, 6x Intel® Data Center GPU Max Series, BIOS EGSDCRB1.DWR.0085.D12.2207281916, ucode 0xac000040, Agama pvc-prq-54, SUSE Linux Enterprise Server 15 SP3, kernel 5.3.18, oneAPI 2022.3.0.

Intel® Data Center GPU Max Series with HBM host: Tested by Intel as of 10/28/2022. 1-node, 2x Intel® Xeon® Max 9480, HT On, Turbo On, total memory 128GB HBM2e, 6x Intel® Data Center GPU Max Series, BIOS EGSDCRB1.DWR.0085.D12.2207281916, ucode 0xac000040, Agama pvc-prq-54, SUSE Linux Enterprise Server 15 SP3, kernel 5.3.18, oneAPI 2022.3.0.