Dell and Big Data: infinite possibilities

This week I’m headed to the Big Data Innovation Summit (BDIS) 2014 in Santa Clara, CA, to present, learn, and gain a better understanding of the innovative technologies and latest developments in the Hadoop ecosystem. Michael Dell recently penned a blog highlighting how customers use Hadoop to analyze all of their data, delivering a better experience for their customers while building new business models to gain a competitive advantage. One thousand customers from over 80 industries will be in attendance at BDIS, with 56 industry leaders presenting use cases and sharing their expertise. I will be right there with customers: conversing, learning, listening, and asking as many questions as possible.

Sharing experiences and expertise is crucial to the adoption of Hadoop and to driving the ecosystem forward. As customers evolve their Big Data solutions, their needs mature as well, so it’s essential to learn how customers use Hadoop and which use cases deliver the highest return to the business. Dell has built its Hadoop expertise by listening to and collaborating with customers, providing a valued set of tools to help them build their environments. Customers initially asked Dell to focus on helping them build Hadoop clusters from the ground up, with simple, easy-to-use deployments and reference architectures serving as blueprints. Today, customers are asking Dell to build engineered Big Data use-case solutions that are easier to deploy, integrate, and scale. Dell has responded.

Additionally, Dell’s Big Data team will be hosting a panel session, “HPC and Big Data – Convergence or Competition.” The session will explore how organizations with HPC implementations can leverage existing storage infrastructure and high-performance file systems to execute MapReduce processing. Niall Gaffney from the Texas Advanced Computing Center will present his latest research on how HPC technologies and environments can benefit from data-oriented computational workloads within Hadoop.

Dell wants to enable HPC customers to use their existing infrastructure, eliminating the up-front expense of dedicated hardware for MapReduce jobs. This also lets customers keep their current high-performance file system, applying the expertise and infrastructure they already have in place without modification – saving time and money.
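As a rough illustration of how little has to change, here is a minimal sketch of a stock Hadoop MapReduce job pointed at a POSIX-mounted parallel file system instead of HDFS. This is not Dell’s tooling; the word-count job and the /mnt/lustre mount point are hypothetical stand-ins for whatever high-performance file system a site already runs:

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class PosixWordCount {

  // Standard word-count mapper: emit (word, 1) for every token in the input.
  public static class TokenMapper extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer tokens = new StringTokenizer(value.toString());
      while (tokens.hasMoreTokens()) {
        word.set(tokens.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Standard word-count reducer: sum the counts for each word.
  public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable v : values) {
        sum += v.get();
      }
      context.write(key, new IntWritable(sum));
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // Point Hadoop at the local (POSIX) file system instead of HDFS, so the
    // job reads and writes directly on the parallel file system mount
    // (e.g., a Lustre or GPFS mount visible on every compute node).
    conf.set("fs.defaultFS", "file:///");

    Job job = Job.getInstance(conf, "posix word count");
    job.setJarByClass(PosixWordCount.class);
    job.setMapperClass(TokenMapper.class);
    job.setCombinerClass(SumReducer.class);
    job.setReducerClass(SumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);

    // Hypothetical mount point for the existing high-performance file system.
    FileInputFormat.addInputPath(job, new Path("/mnt/lustre/input"));
    FileOutputFormat.setOutputPath(job, new Path("/mnt/lustre/output"));

    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

The only differences from a conventional HDFS job are the fs.defaultFS setting and the input/output paths; the mapper, reducer, and job setup are unchanged, which is what makes reusing the existing file system attractive.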

I’m excited and ready to learn so I can help build the most relevant, customer-inspired Big Data solutions. If you would like to learn more about Dell Hadoop Solutions, please visit Dell.com/Hadoop or our microsite, and watch this space for a recap of BDIS in the coming weeks.

About the Author: Armando Acosta

Armando Acosta has been involved in the IT industry for the last 15 years, with experience architecting IT solutions and in product marketing, management, planning, and strategy. His latest role has focused on Big Data/Hadoop solutions, addressing emerging customer needs with new capabilities and assisting with the roadmap for new products and features. Armando is a graduate of the University of Texas at Austin and resides in Austin, TX.