by Martin Brown | Published October 22, 2013
In this series of articles, we look at a range of methods for integrating Apache Hadoop with traditional SQL databases: simple data-exchange techniques, live data sharing between the two systems, and SQL-based layers on top of Apache Hadoop, including HBase and Hive, that act as the method of integration.
Examine some of the basic architectural aspects of exchanging information and the basic techniques for performing data interchange.
Explore how and when to use HBase and Hive to exchange data with your SQL data stores.
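To give a flavor of the SQL-layer approach mentioned above, a Hive external table can expose files already stored in HDFS to ordinary SQL queries. The sketch below is illustrative only; the table name, columns, and HDFS path are assumptions, not details from this series:

```sql
-- Minimal sketch: expose delimited data already sitting in HDFS
-- as a SQL-queryable table. Names and path are hypothetical.
CREATE EXTERNAL TABLE sales (
  order_id INT,
  customer STRING,
  amount   DOUBLE
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/data/sales';

-- Standard SQL then runs against the underlying HDFS files:
SELECT customer, SUM(amount)
FROM sales
GROUP BY customer;
```

Because the table is EXTERNAL, dropping it removes only the metadata; the files in HDFS stay in place, which is what makes this pattern useful as an integration layer rather than a data migration.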