Over the last year, conversations about in-memory computing (IMC) have become
increasingly common in enterprise IT circles, especially as organizations feel
the pressure to process massive quantities of data at the speed the Internet
now demands. The hype around IMC is justified: tasks that once took hours to
execute finish in seconds once computation and data move from disk to RAM.
With this one adjustment, analytics happen in real time, and applications (as
well as application development) keep pace with this new standard of speed.
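To make the disk-versus-RAM point concrete, here is a toy sketch in plain Java (not any vendor's API; the class and key names are illustrative only) of the core idea: pay the expensive disk or database read once, then serve all repeated lookups directly from memory.

```java
import java.util.concurrent.ConcurrentHashMap;

// Toy illustration of in-memory computing's core trick: keep hot data
// in RAM so repeated accesses never touch disk again.
public class InMemoryCacheSketch {
    private final ConcurrentHashMap<String, String> cache = new ConcurrentHashMap<>();

    // Stand-in for an expensive disk or database read.
    private String loadFromDisk(String key) {
        return "value-for-" + key;
    }

    // First access pays the disk cost; later accesses are pure RAM lookups.
    public String get(String key) {
        return cache.computeIfAbsent(key, this::loadFromDisk);
    }

    public static void main(String[] args) {
        InMemoryCacheSketch grid = new InMemoryCacheSketch();
        System.out.println(grid.get("order-42")); // loads once from "disk"
        System.out.println(grid.get("order-42")); // served from memory
    }
}
```

Real in-memory data grids extend this idea with partitioning, replication, and the ability to ship computation to the node holding the data, but the latency win comes from the same substitution shown here.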
Although IMC has become both more cost-effective and more widely accepted
within enterprise computing, a handful of myths still confuse even the most
technical people in enterprise IT.
Myth: In-memory computing is about dat... (more)
A few months ago I spoke at a conference, where I explained the difference
between caching and an in-memory data grid. Having since realized that many
people also want to better understand the difference between two major
categories of in-memory computing, the In-Memory Database and the In-Memory
Data Grid, I am sharing the succinct version of my thinking on the topic,
thanks to a recent analyst call that helped put everything in place.
Skip to the conclusion to get the bottom line.
Let's clarify the naming and buzzwords first. In-Memory Database (IMDB) is a ... (more)
After five days (and eleven meetings) with new customers in Europe, Russia,
and the Middle East, I think the time is right for another refinement of the
definition of in-memory computing. It is clear to me that our industry lags
when it comes to explaining in-memory computing to potential customers. We
struggle to come up with a simple, understandable definition of what
in-memory computing is, what problems it solves, and which uses are a good
fit for the technology.
In-Memory Computing: What Is It?
In-memory computin... (more)
by Abe Kleinfeld and Nikita Ivanov
Gordon E. Moore's famously predicted tech explosion was prophetic, but it may
have hit a snag. While the number of transistors on integrated circuits has
doubled approximately every two years since his 1965 paper, the ability to
process and transact on data hasn't. We're now ingesting data faster than we
can make sense of it, leaving computing at an impasse. Without a new
approach, the innovation promised by the combination of Big Data and internet
scale may be like the flying cars we thought we'd see by 2014. Fortunately,
this is not the c... (more)
If you know anything about Hadoop architecture, you will understand why the
task seemed daunting to us; it proved to be one of the most challenging
engineering feats we have accomplished so far.
Almost 24 months of development, tens of thousands of lines of Java, Scala,
and C++ code, multiple design iterations, several releases, and dozens of
benchmarks later, we have a product that delivers real-time performance to
Hadoop with only minimal integration and no ETL required. It is backed by
customer deployments that prove our performance claims and validate our
architecture.
Here's how we d... (more)