Crucial expands with its highest density server memory to date

31 May 2017

Micron Technology specialises in memory solutions. Through its global brands, the company delivers solutions for computing, consumer, enterprise storage, data center, mobile, embedded, and automotive applications.

Created in 1996, Crucial is one of Micron Technology’s brands.

Crucial has expanded its server memory portfolio with its highest density server memory module to date: 128GB DDR4 LRDIMMs.

The new modules offer speeds starting at 2666 MT/s and aim to increase the installed memory capacity per server.

Key features of the new product portfolio include:

  • Increased performance and memory capacity with module densities up to 128GB
  • Available in speeds starting at 2666 MT/s
  • 100% component and module tested to mission-critical server standards
  • Compatible with OEM servers
  • Optimised for future Intel Xeon processor product families
  • Backed by a limited lifetime warranty and the Crucial Reliance Program

The new modules, with their increased system memory density, aim to accelerate memory-intensive server applications.

Crucial explains that memory-dependent server applications such as virtualisation, in-memory database computing, and high performance computing (HPC) require massive amounts of available RAM.

The new modules support a range of computing applications including Microsoft SQL, Oracle, Microsoft Azure, VMware VDI, Cloudera, Hortonworks, and SAP HANA.

Crucial claims that its 34-stage manufacturing process and more than 100 tests and verifications ensure each module is 100% component- and module-tested to mission-critical server standards.

“We’re committed to helping organisations of all sizes reduce costly server performance slowdowns by offering reliable, quality, and high-density server memory,” comments Michael Moreland, Crucial worldwide product manager.

“These new Crucial 128GB DDR4 LRDIMM modules will help IT Managers extend the life of their current infrastructure as well as optimise new server deployments with best-in-class quality of service in mind.”
