HCC Hortonworks Community Connection

How to build an Ambari cluster on one node

Question by Michael Bronson · Apr 15 at 10:57 AM · Tags: ambari-server, ambari-service, ambari-agent, ambari-blueprint, hdp-2.3.4

Until now we have built every Ambari cluster with the following structure:

3 JournalNode machines (Kafka is installed on each JournalNode)

3 worker machines (DataNode machines)

But in order to minimize costs, we want to build one Linux machine that includes one JournalNode (master) + worker + Kafka.

Does Hortonworks have a procedure for this structure, i.e. building a JournalNode + worker + Kafka on one single machine?

1 Reply

Best Answer

Answer by Jay Kumar SenSharma · Apr 15 at 11:12 AM

@Michael Bronson

Do you mean that there is no NameNode on that single host? Since you mention a JournalNode, do you mean that you want NameNode HA on a single-node machine?

If NameNode HA is not a requirement, then you can take a look at the HDP Sandbox, export its blueprint, and create a cluster based on that. Or just take an HDP Sandbox, delete whatever unwanted services/components you do not want in your single-node cluster, and then either use it as-is or export its blueprint and use that.
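The export-and-reuse flow described above can be sketched with the Ambari REST API. This is a hedged sketch against a live Ambari server: the hostnames (`sandbox`, `new-ambari`), the cluster name `Sandbox`, the blueprint name `single-node-bp`, and the `admin:admin` credentials are illustrative assumptions, not values from this thread.

```shell
# 1. Export the running Sandbox cluster's blueprint (Ambari REST API).
curl -u admin:admin -H 'X-Requested-By: ambari' \
  'http://sandbox:8080/api/v1/clusters/Sandbox?format=blueprint' \
  -o sandbox-bp.json

# 2. Register that blueprint on the target Ambari server under a new name.
curl -u admin:admin -H 'X-Requested-By: ambari' -X POST \
  -d @sandbox-bp.json \
  'http://new-ambari:8080/api/v1/blueprints/single-node-bp'

# 3. Create a cluster from the blueprint using a cluster-creation template
#    (a JSON file mapping your one host into the blueprint's host group).
curl -u admin:admin -H 'X-Requested-By: ambari' -X POST \
  -d @cluster-template.json \
  'http://new-ambari:8080/api/v1/clusters/mycluster'
```

Between steps 1 and 2 you would typically trim unwanted services/components out of `sandbox-bp.json`, exactly as the answer suggests.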

Michael Bronson · Apr 15 at 11:26 AM

The NameNode should be defined on this machine; all we want is to minimize everything onto one machine (master + worker + Kafka).

Michael Bronson · Apr 15 at 11:37 AM

@Jay regarding the sandbox, from where do I download the blueprint file? As you know, we separate the worker from the master machine, so how do we integrate them on one machine?

Jay Kumar SenSharma ♦ Michael Bronson · Apr 15 at 11:52 AM

@Michael Bronson

The HDP Sandbox is a single-node cluster, which can be downloaded from https://hortonworks.com/products/sandbox/

The following link shows how to use the HDP Sandbox (single-node cluster): https://hortonworks.com/tutorial/learning-the-ropes-of-the-hortonworks-sandbox/

Michael Bronson · Apr 15 at 11:57 AM

@Jay just to be clear, I want to install the Sandbox on my existing Red Hat Linux machine (I have already configured the hostname and IP on that local machine), so is this relevant for me?

Michael Bronson · Apr 16 at 06:01 AM

@Jay, we have a problem when we deploy the OVA template. Can you assist with that? https://community.hortonworks.com/questions/186221/cant-deploy-sandbox-hdp-264-ova-template-from-vsph.html

Jay Kumar SenSharma ♦ Michael Bronson · Apr 16 at 06:37 AM

@Michael Bronson

I was re-reading this thread and it looks like I misunderstood your requirement.

Can you please confirm whether my understanding of your query is correct?

Based on your issue description, I understood that you want to configure:

1. A single-node cluster.

2. That single-node cluster should have "one Journal node machine ( master ) + worker + kafka", meaning that on this single node you want to install a JournalNode + DataNode + Kafka broker.

My query: in this case, where will the NameNode be? Also, you are using the term JournalNode, which applies in NameNode High Availability mode. So are you planning to install both NameNodes on the same single host (by resolving the port conflicts)?

Michael Bronson · Apr 16 at 08:03 AM

@Jay your points 1 and 2 are correct; we want to install the master + worker + Kafka on one single Linux node.
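A single-node layout like the one requested above can also be expressed directly as an Ambari blueprint, skipping the Sandbox entirely. The sketch below builds a minimal blueprint that puts master, worker, and Kafka roles into one host group; since there is no NameNode HA, it uses a Secondary NameNode and no JournalNode. All names here (blueprint name, stack version, FQDN) are illustrative assumptions, not values from this thread.

```python
import json

# Hypothetical minimal blueprint: every role lands in one host group,
# so the whole cluster fits on a single machine.
blueprint = {
    "Blueprints": {
        "blueprint_name": "single-node",
        "stack_name": "HDP",
        "stack_version": "2.6",
    },
    "host_groups": [
        {
            "name": "host_group_1",
            "cardinality": "1",
            "components": [
                {"name": "NAMENODE"},            # master
                {"name": "SECONDARY_NAMENODE"},  # no HA, so no JournalNode needed
                {"name": "DATANODE"},            # worker
                {"name": "ZOOKEEPER_SERVER"},    # required by the Kafka broker
                {"name": "KAFKA_BROKER"},        # kafka
            ],
        }
    ],
}

# Matching cluster-creation template: maps the one physical host
# (hypothetical FQDN) into the single host group.
cluster_template = {
    "blueprint": "single-node",
    "host_groups": [
        {"name": "host_group_1", "hosts": [{"fqdn": "node1.example.com"}]}
    ],
}

print(json.dumps(blueprint, indent=2))
```

The two JSON documents would then be POSTed to Ambari's `/api/v1/blueprints/...` and `/api/v1/clusters/...` endpoints respectively; a real deployment would also need service configurations (heap sizes, directories) tuned for one machine.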

HCC Guidelines | HCC FAQs | HCC Privacy Policy

Hortonworks - Develops, Distributes and Supports Open Enterprise Hadoop.

© 2011-2017 Hortonworks Inc. All Rights Reserved.
Hadoop, Falcon, Atlas, Sqoop, Flume, Kafka, Pig, Hive, HBase, Accumulo, Storm, Solr, Spark, Ranger, Knox, Ambari, ZooKeeper, Oozie and the Hadoop elephant logo are trademarks of the Apache Software Foundation.
Privacy Policy | Terms of Service