Give it a try, and provision an arbitrary number of Hadoop clusters on your laptop (or in a production environment) using our container and Ambari Shell. Let us know how it works for you. Enjoy.
Get the Docker container
If you don’t have Docker yet, browse our previous posts – we have several how-tos, examples and best practices for Docker in general, and for running the full Hadoop stack on Docker in particular.
Once you have the container you are almost ready to go – as always, we automate everything and radically simplify Hadoop provisioning.
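Pulling the container is a single Docker command. A minimal sketch – the image name `sequenceiq/ambari` is taken from our Docker Hub repository, so verify it (and the tag) against the current listing before relying on it:

```shell
# Image name assumed from the SequenceIQ Docker Hub repository -- verify the tag there.
IMAGE="sequenceiq/ambari"

# Pull the image if a Docker client is available; otherwise just show the command.
# The '|| true' keeps the snippet harmless on machines without a running daemon.
if command -v docker >/dev/null 2>&1; then
  docker pull "$IMAGE" || true
else
  echo "docker pull $IMAGE"
fi
```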
Get the ambari-functions file from our GitHub repository.
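Downloading and sourcing the file brings the helper functions into your current shell. A sketch, assuming the raw GitHub URL of our docker-ambari repository (substitute the link from the post if it differs):

```shell
# Assumed raw URL for the ambari-functions file -- check the link in the post.
FUNCTIONS_URL="https://raw.githubusercontent.com/sequenceiq/docker-ambari/master/ambari-functions"

# Download the file (needs network access) ...
curl -fsSLo .amb "$FUNCTIONS_URL" || echo "download failed (offline?)"

# ... and source it so the amb-* helper functions are available in this shell.
[ -s .amb ] && . .amb || true
```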
Create your cluster
Whaaat? No really, that’s it – we have just provisioned you a 4-node Hadoop cluster in less than 2 minutes. Docker, Apache Ambari and Ambari Shell combined are quite powerful, aren’t they? You can always start playing with your desired services by changing the blueprints – the full Hadoop stack is supported.
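The provisioning step above boils down to one helper call. A sketch, assuming the helper is named `amb-deploy-cluster` as in our ambari-functions file – treat the name as an assumption and verify it against the copy you downloaded (a stub is defined here so the snippet runs even without Docker):

```shell
# 'amb-deploy-cluster' is assumed to come from the sourced ambari-functions file.
# Define a stand-in only if the real helper is absent, so the sketch is self-contained:
if ! type amb-deploy-cluster >/dev/null 2>&1; then
  amb-deploy-cluster() { echo "would provision a $1-node Hadoop cluster"; }
fi

# One call starts the containers and drives Ambari to build the cluster
# from the default blueprint:
amb-deploy-cluster 4
```

Requires bash (function names containing hyphens are not portable POSIX sh).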
If you’d like to play around and understand how this works, check our previous blog posts – a good start is the first post about one of our contributions, the Ambari Shell.
You have just seen how easy it is to provision a Hadoop cluster on your laptop. If you’d like to see how we provision a Hadoop cluster in the cloud using the very same Docker image, check out our open source, cloud-agnostic Hadoop-as-a-Service API – Cloudbreak. Last week we released a project called Periscope – the industry’s first open source autoscaling API for Hadoop.