Tuesday, February 22, 2011

Several Node Managers in Oracle WebLogic on one machine

This is an age-old dilemma for people like me. You work on a project that has several environments (DEV, TEST, QA, PRD). Typically, as a project member you are not allowed access to QA and PRD - which is very good. The main problem is that the QA and PRD environments offer high availability, which you have to document without having access to them.

In order to provide a good installation manual, you need to come up with a clustered setup even when you do not have enough machines.

So for my current project I have chosen a different approach.
I have four blades: three with one CPU (dual core) and one with two CPUs (also dual core).

I use one for the database, one for the middle tier (the bigger one), one for the web tier, and the fourth one for the load balancer and firewall.

The load balancer and firewall will be done with iptables and HAProxy.
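As a sketch of that fourth blade, the setup might look like the following. All addresses, ports, and server names here are hypothetical placeholders, not the ones from my actual environment:

```shell
# Firewall side: redirect incoming HTTP to the port HAProxy listens on
# (port numbers are placeholders).
iptables -t nat -A PREROUTING -p tcp --dport 80 -j REDIRECT --to-port 8080

# Load balancer side: a minimal haproxy.cfg fragment balancing two
# hypothetical web-tier nodes round-robin.
cat <<'EOF' >> /etc/haproxy/haproxy.cfg
frontend web_front
    bind *:8080
    default_backend web_nodes

backend web_nodes
    balance roundrobin
    server web1 10.0.0.11:7777 check
    server web2 10.0.0.12:7777 check
EOF
```

Both commands need root, and the HAProxy fragment has to be followed by a reload of the haproxy service to take effect.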

Now on the middle tier I needed to have 14 different nodes.
I created virtual NICs and added them to the hosts file as well.
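On Linux this step can be sketched as IP aliases on the physical interface, each mapped to its own hostname. The interface name, addresses, and hostnames below are placeholders:

```shell
# Create IP aliases on the physical interface (eth0 is a placeholder);
# repeat for as many virtual NICs as you need nodes.
ifconfig eth0:1 192.168.1.101 netmask 255.255.255.0 up
ifconfig eth0:2 192.168.1.102 netmask 255.255.255.0 up

# Give each alias its own hostname in /etc/hosts so WebLogic can bind
# to it by name.
cat <<'EOF' >> /etc/hosts
192.168.1.101  wls-node01
192.168.1.102  wls-node02
EOF
```

Note that plain aliases like these do not survive a reboot; to make them permanent they have to go into the distribution's network configuration files.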

In order to have "independent" WebLogic Servers I installed them in different WLS_HOME directories. Now I want a Node Manager, an Admin Server, and at least one Managed Server per NIC. This proved a little more difficult than expected.

Changing the Node Manager to run on a different port is one activity that already takes three steps. As all my Managed Servers will run on SSL as well, I ran into the BEA-090482 hostname verification error. Solving it meant recreating the Java keystore with a certificate for the virtual NIC's hostname. After that I was able to start everything independently.
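A sketch of those two fixes follows. Hostnames, ports, passwords, and paths are placeholders; the property names are the standard Node Manager ones, and `keytool` ships with the JDK:

```shell
# Per installation: make the Node Manager bind to the virtual NIC's
# hostname and a unique port via nodemanager.properties.
cat <<'EOF' >> $WLS_HOME/common/nodemanager/nodemanager.properties
ListenAddress=wls-node01
ListenPort=5557
EOF

# Recreate the identity keystore so the certificate CN matches the
# virtual hostname - this is what resolves the BEA-090482 hostname
# verification failure. All values below are placeholders.
keytool -genkeypair -alias wls-node01 \
    -keyalg RSA -keysize 2048 \
    -dname "CN=wls-node01, OU=Dev, O=Example, C=NL" \
    -keystore identity.jks -storepass changeit -keypass changeit \
    -validity 365
```

The new keystore then has to be configured on each Managed Server's SSL settings (identity keystore path and passphrase) in the Admin Console.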

I will try to write everything up as a document tonight and place it on my website.
