And it’s cheaper too!

Maximize G&G Application Performance & Lower TCO Simultaneously

As oil and gas companies wrestle with delivering dramatic reductions in their operating and capital budgets, whilst maintaining a razor-sharp focus on safe and efficient operations, many people are asking how, or even if, IT can support companies in these challenging times. Whatever answer we come up with, the starting point in today’s climate must be to deliver material reductions in cost. Oh, and you had better remember that at the last count the industry has 250,000 fewer employees, so make sure you figure out how to transform user productivity.

So how can we drive up user productivity and reduce costs simultaneously? Recently we have been doing a lot of engineering work on next-generation geoscience systems. Interpretation and modelling applications tend to be heavily workstation-oriented: individual users are equipped with expensive, self-contained computing resources. When data is required, a low-latency, high-bandwidth transfer from network storage to the workstation is needed to achieve useful levels of performance and productivity.

However, as data velocity and volume increase, sustaining workstation-based applications is a real challenge. Individual workstations need ever-increasing levels of computing power, memory and storage to cope, and delivering the required high-bandwidth, low-latency I/O becomes eye-wateringly expensive.

[Figure: the workstation approach]

This rather expensive exercise places IT under constant pressure to size computing resources for the largest expected workload, which means that workstations are often either over-specified relative to average workloads or under-specified, leading to user frustration and inefficient working practices. The overall result is that computing power, storage and memory cannot be correctly balanced against workloads, since the high-end resources are not always needed and cannot be shared.


There is also a penalty in workload throughput. We have observed cases where geoscientists need to wait as long as 30 minutes to load projects into their applications. This has a negative impact on team productivity and agility, particularly when seismic data forms a critical part of the workflow.
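To see why, consider a rough back-of-the-envelope calculation. The sketch below (in Python, with purely illustrative numbers for project size, link speed and link efficiency, not measurements from any specific site) estimates how long a large project takes to reach a workstation over a typical office link, versus compute co-located with storage on a data-center fabric:

```python
# Illustrative back-of-the-envelope: how long does it take to move a
# seismic project to where the compute lives? All figures below are
# assumptions chosen for illustration, not measurements.

def load_time_minutes(project_gb: float, link_gbps: float,
                      efficiency: float = 0.7) -> float:
    """Time to transfer project_gb gigabytes over a link of link_gbps,
    assuming we only achieve `efficiency` of the nominal line rate."""
    effective_gbps = link_gbps * efficiency
    seconds = (project_gb * 8) / effective_gbps  # GB -> gigabits
    return seconds / 60

project_gb = 200.0  # hypothetical interpretation project with seismic volumes

# Workstation on a 1 Gbps office LAN vs. compute co-located with storage
# on a 40 Gbps data-center fabric.
for label, gbps in [("1 Gbps to workstation", 1), ("40 Gbps in data center", 40)]:
    print(f"{label}: {load_time_minutes(project_gb, gbps):.1f} minutes")
```

With these assumed numbers, the office-LAN transfer lands in the tens of minutes, consistent with the load times we have observed, while the co-located case completes in about a minute.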

The core strategy for addressing these challenges is the centralization of computational resources (both CPU and GPU) inside the data center using Converged Infrastructure. Essentially, this approach takes the enormous amount of computational power that sits out on workstations and relocates it into the data center. There are two key reasons why this is beneficial:

  1. Operational Efficiency – the workstation-oriented approach leaves much of the computing resource underutilized and requires an expensive IT support mechanism. A shared pool of central resources allows appropriate resources to be provisioned to users on thin-client devices (see the sketch after this list) – far more with less
  2. Efficient Network Utilization – by co-locating computational resources with the data, we remove the need to shift large volumes of data over the network to individual workstations, giving wider, easier access to a rich data set – again, far more with less
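Why does pooling help? Because individual users rarely hit peak demand at the same moment, a shared pool can be sized well below the sum of per-user peaks. Here is a minimal simulation in Python; the user count, burst probability and core figures are assumptions chosen purely for illustration:

```python
import random

# A minimal sketch (with made-up numbers) of why a shared pool beats
# per-user peak provisioning: users rarely burst at the same time.

random.seed(42)
USERS, HOURS = 50, 1000
AVG_CORES, PEAK_CORES = 4, 32   # hypothetical per-user demand profile

def demand() -> int:
    # 5% of the time a user bursts to peak; otherwise modest background load
    return PEAK_CORES if random.random() < 0.05 else random.randint(1, AVG_CORES)

# Sizing every workstation for its own peak...
per_user_peak_total = USERS * PEAK_CORES

# ...versus the worst simultaneous demand the shared pool ever sees
pool_high_water = max(sum(demand() for _ in range(USERS)) for _ in range(HOURS))

print(f"Provision every workstation for peak: {per_user_peak_total} cores")
print(f"Shared pool high-water mark:          {pool_high_water} cores")
```

Under these assumptions the pool's high-water mark comes out at a fraction of the fleet-wide peak provision, which is exactly the headroom that centralization reclaims.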

[Figure: petrotechnical applications delivered via VDI]

TCO studies have shown that EMC’s Petrotechnical Appliance – a Converged Infrastructure solution – can achieve cost savings in excess of 35% by migrating from distributed workstation computing to centralized computing delivered through VDI. At the same time, end users experience a more consistent delivery of computing resources, matched to individual workload demands, without needing expensive workstation upgrades – higher end-user productivity and lower costs!
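To make the shape of such a comparison concrete, here is a purely illustrative three-year TCO calculation in Python. None of these figures come from the EMC study; the seat count, hardware prices and support costs are all hypothetical placeholders:

```python
# Purely illustrative TCO arithmetic (assumed numbers, not figures from
# the EMC study): a fleet of G&G workstations vs. a shared, centrally
# delivered VDI platform over a 3-year horizon.

USERS = 100
YEARS = 3

# Hypothetical per-seat workstation costs
wkst_capex = 12_000          # high-end G&G workstation
wkst_support = 2_000         # annual support / refresh reserve per seat
workstation_tco = USERS * (wkst_capex + wkst_support * YEARS)

# Hypothetical centralized alternative: shared infrastructure sized for
# the pooled (not per-user peak) workload, plus thin clients
shared_infra = 900_000       # converged compute/GPU/storage pool
thin_client = 500            # per-seat endpoint
central_support = 50_000     # annual platform support
central_tco = shared_infra + USERS * thin_client + central_support * YEARS

saving = 1 - central_tco / workstation_tco
print(f"Workstation TCO: ${workstation_tco:,}")
print(f"Centralized TCO: ${central_tco:,}")
print(f"Saving: {saving:.0%}")
```

With these placeholder inputs the saving comes out close to the 35%+ range reported above; the point is not the exact numbers but that pooled sizing and cheaper endpoints drive the delta.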

[Figure: TCO comparison, workstation vs. centralized computing]
The EMC Petrotechnical Appliance is based on the industry-leading Vblock® Converged Infrastructure from VCE. It is being increasingly adopted by oil & gas companies and is recommended by leading Oil Field Services companies as a key ingredient for optimizing geoscience operations, particularly in the current oil & gas economic climate.

About the Author: David Holmes