Kick It Up a Notch

Many techniques for High Performance Data Analytics have been around since the 1960s, but as the size and complexity of big data needs grow, so does the need for high-performance computing (HPC). PSS delivers experience applying modern cloud frameworks within an HPC environment. Instead of the Hadoop Distributed File System (HDFS), we use the General Parallel File System (GPFS) as the underlying storage layer, providing at least a 10x speedup when storing data and even greater gains when retrieving it.
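
Because GPFS presents a POSIX-compliant file system, analytics code can read and write data with ordinary file I/O instead of an HDFS-specific client. Here is a minimal sketch in Python, assuming a GPFS file system mounted at the hypothetical path /gpfs/data and a CSV file with a hypothetical "bytes" column:

    import csv

    # Hypothetical GPFS mount point; any POSIX path on the file system works.
    GPFS_PATH = "/gpfs/data/events.csv"

    # Plain file I/O -- no HDFS client library or RPC layer required,
    # because GPFS exposes ordinary POSIX read/write semantics.
    with open(GPFS_PATH, newline="") as f:
        total = sum(int(row["bytes"]) for row in csv.DictReader(f))

    print(f"total bytes processed: {total}")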

Computing That’s Faster, Better, Stronger

High-performance systems often use custom-built components alongside so-called commodity hardware. PSS specializes in combining HPC with big data to introduce innovative capabilities at a new level of scalability. We have experience delivering large-scale in-memory key/value stores, machine learning, and massively parallel streaming analytics on HPC platforms.

In addition, PSS experts are skilled in ontological RDF modeling at the scale of billions of nodes, as well as:

  • High-speed streaming with IBM InfoSphere Streams and Apache Storm
  • Massive in-memory stores with Redis
  • High-throughput messaging with Apache Kafka (see the sketch after this list)
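
As one illustration of how these pieces fit together, the following sketch consumes messages from Kafka and keeps the latest value per key in Redis as an in-memory store. It is a minimal sketch, not a reference implementation: the topic name "events", the "id" field, the key scheme, and the localhost endpoints are all assumptions, and it relies on the kafka-python and redis client libraries.

    import json

    import redis
    from kafka import KafkaConsumer

    # Assumed endpoints: a local Kafka broker and a local Redis instance.
    consumer = KafkaConsumer(
        "events",  # hypothetical topic name
        bootstrap_servers="localhost:9092",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )
    cache = redis.Redis(host="localhost", port=6379)

    # Stream records out of Kafka and hold the latest value for each key
    # in Redis, giving downstream analytics fast in-memory lookups.
    for msg in consumer:
        record = msg.value  # assumes JSON objects with an "id" field
        cache.set(f"event:{record['id']}", json.dumps(record))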

We also specialize in custom message passing via MPI programming, which can replace slow map-reduce jobs (see the sketch below), together with specialized graph-based architectures such as Cray's Urika system.
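
The sketch below shows this message-passing pattern with mpi4py, re-implementing the classic map-reduce word count: each rank counts words in its share of the input (the "map"), and a single MPI reduction merges the partial counts (the "reduce") without a framework's shuffle-to-disk barrier. The four input lines are invented purely for illustration.

    from collections import Counter

    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    size = comm.Get_size()

    # Stand-in corpus; in practice each rank would read its own shard
    # from the parallel file system.
    lines = [
        "high performance computing",
        "big data meets high performance",
        "message passing replaces map reduce",
        "streaming analytics at scale",
    ]

    # "Map" phase: each rank counts words in its slice of the input.
    local_counts = Counter()
    for line in lines[rank::size]:
        local_counts.update(line.split())

    # "Reduce" phase: one reduction call merges every rank's partial
    # counts on rank 0, with no intermediate results written to disk.
    totals = comm.reduce(local_counts, op=MPI.SUM, root=0)

    if rank == 0:
        for word, count in sorted(totals.items()):
            print(f"{word}: {count}")

Run it with, for example, mpiexec -n 4 python wordcount.py.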

Demand high performance.