Installing Connections

In this blog we’re going to look at installing a new Connections 5.5 environment and then updating it with the latest fixes.

Setting Up The Repository
We start installing Connections by configuring a new repository in Installation Manager. Having extracted the Connections installer, you will find repository.config in the IBM Connections directory.

skitch.33

We opted to deselect the existing repositories that we used to install WebSphere. We may need them later, so deselecting is a better option than removing them entirely. If, however, you do delete the installers from your file system, you should remove them from the list of repositories as well.

skitch.32

What Are We Installing?
The only product we can install is IBM Connections Version 5.5.0.0. Don’t worry about the fixes and updates at this point; we are just doing a base install initially.

skitch.31

Since we have never installed Connections on this machine Installation Manager will create a new package group to track the software.  The previous package group is used for WebSphere products only.

skitch.30

The next page has a list of features that can be installed as part of Connections. By default all are selected, and I would always install all of them for a standard Connections build. If you scroll down the list there are “Optional Components”, such as Connections Content Manager, which are not selected by default. These require additional licensing.

skitch.29

Validation
The next few screens take us through the Connections install and validate that everything else is in place. You need to have installed and configured WebSphere as well as DB2 (or SQL Server/Oracle), created the databases and preferably populated them before you can install Connections.

Connecting To The Deployment Manager
On the first screen the installer needs to validate that WebSphere is in place and that it can log in. The account listed here as “Administrator User ID” will by default be given access to all the Connections applications as they install. You can’t click “Next” on this screen until you click “Validate” and it changes to “Validated“. To achieve that, the installer will try to log in to the Deployment Manager using the credentials you supply.

Make sure the Deployment Manager is running or the installer will not be able to log in.

skitch.28

 

You may receive a warning at this point that application security is not enabled, and that will prevent the installer from validating. If that happens, leave the installer on this screen and log back into the ISC at https://hostname:9043/ibm/console.

Go to Security – Global Security and choose “Enable application security”. By default the box is unchecked. It needs to be checked as shown below and the configuration saved; then restart the Deployment Manager and go back to your installer screen to try validating again.

skitch.25

If you receive a warning about missing SSO configuration when you try to enable application security, simply click on “Single Sign-on (SSO)” under “Web and SIP security” on the right and enter the domain you are using for this environment, as shown below.

skitch.26

So… back to our installer screen, where we now have the nice “Validated” button at the bottom, so we can click “Next”.

skitch.22

Where Do The Applications Go?
The topology page is where we choose how the Connections applications are deployed and onto which servers. IBM tries to help you here by asking what size deployment you want.

You should only ever choose a “Small” deployment, where every application is on a single WAS server instance, for a test or development environment. Bear in mind the server will be slow to start as every application will be on it. Finding debug information in the logs may also be harder since each log file will contain content for all the applications.

The most common option is “Medium”, where the applications are spread across multiple server clusters. Each server has its own allocation of memory and runs only the applications assigned to it. In addition, if we wanted to grow the environment by adding a cluster instance, we could choose to do so only for the servers/applications under the most demand. IBM provides a suggestion for which applications should be assigned to which servers, but it’s only a suggestion – you know your business and your requirements best. Bear in mind it’s a good idea to spread the applications that will be under heavy load across different servers.

The “Large” deployment creates one server per application which could mean 20 or more server instances.  This would only be done for extremely large Enterprise deployments and will need to be planned carefully across multiple nodes.

skitch.21

For Connections101 we have chosen a “Medium” topology, so IBM offers to create four individual server clusters to put the applications on. Since our environment currently has only one node, all those instances will be created on that node. This is an opportunity to move applications around between servers and even rename servers if we want.

You should already have a design in mind before you get to this point of the install. If not, stop now and think about how you want the applications deployed and across how many servers.

skitch.20

Verifying The Databases
Our third page is where we tell the installer where the database server and databases are. The installer will take the settings you provide here and attempt to access each database using those settings and credentials. If you created an LCUSER account to use when running the DBWizard, you would use those credentials here. In the past on Linux there have been issues with case sensitivity for the account name, so if the account you created is “lcuser”, make sure you enter “lcuser” in lower case for the credentials.

See the earlier blog on the DBWizard for how to create the lcuser account.

skitch.19

Defining The HTTP Server
On the fourth page we can choose to tell the installer about the IBM HTTP Server we created earlier. This is a good idea and will save you work post-install, as the installer will ensure all the mappings for the applications to use IHS as a front end are in place, including in LotusConnections-Config.xml. If you haven’t already created an IHS instance then you can choose “Do Later”.

Since we have already created the IHS node in our earlier blog and called it webserver1 all we have to do is choose “Do now” and the installer will select our server automatically.

skitch.18

 

Cognos. Leave it for now🙂
We would always recommend installing Cognos later. The steps involved are fairly complex and take us away from the base Connections application install.  Once Connections is installed and updated (and backed up / snapshotted), we can come back and install Cognos.

Choosing not to install Cognos does not prevent the installer from installing the Metrics application, which gathers activity data for Cognos and other 3rd-party tools to use, so you will not lose data if Cognos isn’t immediately deployed.

skitch.17

Where Is The Data Going To Go?
On the content store page the installer needs to be told where to create and store data.  There are two content stores that need to be defined and both are used by every server.

  • a shared network content store, the location of which must be accessible to every server regardless of platform. Each server must use the identical path to find the content store, so the path we enter here has to be visible and accessible to each server. If you have a mixed Windows and Linux environment, or choose to put your content store on a Linux share, ensure that there is a single path that can reach that store from every server (see the example after this list).
  • a local content store. The data in the local store is only used by the server on that machine and should be unique to each machine; however, the path to the local store should be identical for each server. We can work around this post-install by modifying environment variables later on, but for now we should plan for the path to the local store to be consistent.
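As an illustration only (the NFS host, export and mount point here are hypothetical): if the shared store is exported from an NFS server, every node would mount it at the same path, for example

mount -t nfs nfsserver.connections101.info:/export/connections_shared /opt/IBM/Connections/data/shared

so that /opt/IBM/Connections/data/shared refers to the same content on every server.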

The validation check on this page only verifies that the paths and directories exist as far as the installer is concerned. It doesn’t verify that all servers can see them, or even that the locations are writable as well as readable. That’s something you need to confirm yourself.

skitch.16

Mail Notifications And Replies
On the next tab we can configure the mail notifications. This is fairly straightforward to do post-install, so if you haven’t considered how you want notifications to work you can stop and decide that now, or, if you think you might want to change things in the future, come back to it later.

We chose to have notifications (emails) sent to users as well as granting users the ability to reply by mail. Since WebSphere has no mailboxes of its own, the reply-to messages must be delivered to an IMAP-accessible mailbox on a mail server.

skitch.15

When defining how to find a mail server to send mail we have two choices. We can use WebSphere’s own built-in mail session to have SMTP mail sent to a specific server, or we can rely on MX records in DNS to route mail for onward delivery. These options can also be easily changed post-install.

Always use a dedicated account for sending and collecting mail and if possible use secure SMTP so the credentials cannot be intercepted en route.  The reason I prefer to use a dedicated account is to ensure the account is only used by Connections. It’s far too easy for a shared account to be renamed, deleted or even have its password changed without anyone thinking to tell the Connections admin!

skitch.14

Role Mapping
This is a new and very welcome feature in 5.5. Here the installer gives us the option of adding users from our LDAP directory who will automatically be assigned administrative rights to each application (rather than just the connadmin account I’m using for the install). Similarly, we can add individual users as moderators and they will be granted the correct rights to provide moderation across applications.

One of the first things we used to do post-install was go into each of the 20 or so applications and manually add additional administrative users and moderators, so telling the installer to do it for us now saves a lot of time. However, what I really want to do is add groups, not individuals. That way I could have an LDAP group called “ConnectionsAdmins” and add that here.

If you are going to want to use groups rather than users for security, it’s best to leave these options empty at this point and go back post-install to add the groups manually.

skitch.12

We can now complete the install.  This could take anywhere from 20 minutes to 2 hours depending on the complexity of your environment, disk and network speed.

skitch.10

The lovely green tick is what we hope to see. If the install fails, open the install log and read it carefully. In my experience the install most often fails for one of three reasons:

1. The person running the install doesn’t have the authority to write to the WebSphere servers or the file system

2. You run out of disk space on a partition during the install

3. There are old files hanging around from an earlier failed install that need to be removed

skitch.9

Checking Everything Installed
Once the install says it completed successfully, we log in to the ISC and verify all the applications are listed under “Enterprise Applications”.

skitch.8

We also check that all the servers we asked the installer to create on its topology page now exist.

skitch.7

So that’s the install.  However before you go any further you are going to want to install any and all patches that have come out since the base 5.5 version of Connections was released.

Applying Fixes
Even if you think you have all the most recent fixes, now is the time to go to Fix Central and check there are no new updates for Connections 5.5.

skitch.6

If you find new updates you should download them to your file system.  The very first thing we need to do is extract the updater file 5.5.0.0-IC-Multi-UPDI-20151224.zip and find the .zip or .tar file within that.  That compressed file will itself extract into a directory called updateInstaller.

The updateInstaller must be extracted under the Connections install directory which in our environment is /opt/IBM/Connections.  Once extracted it looks like the screenshot below.

skitch.5
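On Linux the extraction itself would be something like this (the inner archive name here is hypothetical – use whichever .zip or .tar file your download actually contains):

cd /opt/IBM/Connections
tar -xvf /opt/Software/updateInstaller.tar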

Any other fixes (jar files) that you download from Fix Central should be copied to the “fixes” directory here.

To run the update wizard for Connections, which applies all the fixes, we run updateWizard.sh (or updateWizard.bat), but first we need to do two things (see the example below):

  1. Run setupCmdLine from the Dmgr bin directory, i.e. /opt/IBM/WebSphere/AppServer/profiles/Dmgr01/bin/setupCmdLine.sh
  2. Set the WAS_HOME variable (as shown above)
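On Linux, with the default paths used in this environment, that would look something like the following sketch (adjust the paths to match your own install):

. /opt/IBM/WebSphere/AppServer/profiles/Dmgr01/bin/setupCmdLine.sh
export WAS_HOME=/opt/IBM/WebSphere/AppServer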

Now we can run updateWizard

skitch.4

The Update Wizard can both install and remove updates. That’s useful because there may be cases where we want to easily remove an update after it has been applied.

In this case we’re going to install updates. The updater lets us choose where the fixes are located, so in theory we could put them anywhere and not just in the “fixes” directory under “/Connections/updateInstaller”; however, it’s safest to put them there and work from that directory.

skitch.3

The installer finds any fixes in that directory and asks us which ones to deploy, showing the application affected (if it’s just a single application) and the release date of the fix.

skitch.2

In this case, because we tried to apply all the fixes at once, the update wizard warns us that some of the fixes are already obsolete, so we can go back to the previous screen and deselect those.

skitch.1

The update wizard gives us a final warning that we need to back up our customisation directory. At this point we are on a new install and have no customisations, so we can go ahead and say we have made no changes.

If you have made changes, even if you don’t think the update you’re applying should affect those changes, make sure you manually back them up.

skitch

Finally it will run and apply all the updates. During this process it will remove applications, add new applications, change configuration settings, clear out temporary data and so on. In fact it could be doing more work than happens during an install, so this could take a long time.

If you are concerned about the progress of the updater, it creates log files in the Connections install directory under the version/log location, e.g. /opt/IBM/Connections/version/log. You can monitor this directory to see each update as it happens and reassure yourself things are progressing.

If you have problems with Connections after an upgrade IBM will want to see those log files so do not delete them.

Finally the updater will finish and confirm what it managed to successfully complete.

skitch.35

Confirming What Version Is Installed

If you’re a control freak like me you might now like to confirm the version of Connections you’re on by running the updateSilent command from the file system. It can be found in the updateInstaller directory we ran the wizard from, and the syntax is

updateSilent -fix -installDir as shown in the screenshot below

skitch.34
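With the install directory used in this environment, that would be something like:

./updateSilent.sh -fix -installDir /opt/IBM/Connections

which should list the fixes currently installed.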

And we’re done. Now go make yourself a coffee and relax before we move on to the next step!

 

Running The Population Wizard

The first thing we’re going to want to do once the Connections applications are successfully installed is test them by logging in as a regular user. To do that, the user must have a profile in the PEOPLEDB database or Connections will throw an error, and the easiest way to ensure one exists is to run the Population Wizard.

The Population Wizard is a simple tool that asks a series of questions (where is your LDAP directory, where are your DB2 databases, where is TDI) and uses those answers to run a one-off TDI script that imports all users into Connections, creating a profile for each of them along the way. It’s designed to work in only one direction (you can pull information from LDAP into the databases, not the other way around).

For a test environment, or a very small manually-maintained environment, you could run the Population Wizard to update the databases whenever you want, but for production that’s not really workable. However, this is just to ensure we have some data in place that we can use for testing. We will work with the custom TDI scripts later to fine-tune what we want.

To run the population wizard look in the “Wizards” directory from the ConnectionsWizards download.  The file will be called populationWizard.sh (or populationWizard.bat).

skitch.6

skitch.5

The JDBC driver path must contain the drivers for your database platform. If you aren’t running TDI on the same server as your databases, you will need to copy the drivers from the database server to the TDI server and reference them here.

The User ID is the account granted full rights to the PEOPLEDB database.  This can often be a custom account and not (as in this case) the instance owner.

skitch.4

Here we tell the wizard where our LDAP server is and how to connect to it.  If you are using a secure port such as 636 make sure you check the box for “Use SSL communication”

skitch.3

These are the bind credentials used to log in and query LDAP. They aren’t stored anywhere or used for any purpose other than running this one-off wizard, so any account that can read the directory will do.

skitch.2

This shows the sample mapping of LDAP attributes to database fields. For example, “mail” in LDAP will map to the field “Office Email”. On this screen we can choose which attributes to map where, and even whether we want to map an attribute at all. Anything we don’t map won’t be populated with data and will appear empty in Connections.

Once the populationWizard completes it should report that it imported your users. To do this it wrote instructions to properties files and ran script files stored in the location:

/Wizards/TDIPopulation/linux/TDI

The files it wrote its instructions to are:

profiles_tdi.properties
solutions.properties
mapdb_repos_from_source.properties

The script files it ran were:

collect_dns.sh
populate_from_dn_file.sh

The activity is recorded in /Wizards/TDIPopulation/linux/TDI/Logs.

We’ll come back to this later when we start customising our syncing activity.

Installing TDI

Tivoli Directory Integrator is used to move data between our LDAP source (in this case Domino) and our DB2 (or SQL or Oracle) profiles database (PEOPLEDB).  IBM supply a range of Wizards and batch files to achieve this and they all depend on having TDI installed somewhere.

Once the TDI installer is extracted we run ./launchpad.sh in Linux (or launchpad.exe in Windows) to start installing TDI.

On Linux this may fail because there is a script that validates whether a supported browser version is installed. The script is a bit old and only checks for Firefox version numbers beginning 1x and 2x, and we’re currently on Firefox 35 and counting. To work around that validation, modify the file browser.sh in the launchpad directory as follows

skitch.13

The regular expression [1-9][0-9] should cover any Mozilla version released for a couple of years 🙂

Now launchpad.sh should run

skitch.10

skitch.9

I choose a custom install because we only need a limited set of features installed for our purposes.

skitch.8

In theory I only need the Server component – the CE (Configuration Editor) is for creating and managing AssemblyLines.  In a simple install we don’t use it but I like to install it regardless and it’s very likely we will use it to customise our directory sync activity.

skitch.7

The working directory should not be a static choice. Each time one of our TDI scripts runs, the parameters for that script will come from the directory it exists in.  By not setting a working directory we are able to run multiple scripts from different locations with different settings.

Once TDI 7.1.1 is installed we need to patch it to fixpack 5 before doing anything further. After downloading the fixpack (7.1.1-TIV-TDI-FP0005.zip) and extracting it there will be a file called UpdateInstaller.jar in the extracted directory.

Copy the new UpdateInstaller.jar file to /opt/IBM/TDI/V7.1.1/maintenance

Copy the extracted fix TDI-7.1.1-FP0005.zip to a convenient location – in this case I’m using the directory where I’m going to run the update from /opt/IBM/TDI/V7.1.1/bin

skitch.1

To apply the fixpack we run ./applyUpdates.sh -update <fixpackfilelocation>
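Using the locations above, that would be something like:

cd /opt/IBM/TDI/V7.1.1/bin
./applyUpdates.sh -update TDI-7.1.1-FP0005.zip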

 

skitch

Once the updates have been applied we can verify it ran successfully using:

./applyUpdates.sh -queryreg

which shows us both components installed and patch levels for each.

In our environment we are installing TDI on the same server where DB2 is installed but if we weren’t we would have to copy the DB2 library files over to the TDI server so the two can communicate.

Setting Up An LDAP Repository

Before we do anything else we now need to make sure our users can login.

WebSphere has only an internal file repository, which contains just the “wasadmin” entry we created when installing. The users and groups we want to be able to use Connections aren’t in that file repository; they are in an LDAP directory (or directories) somewhere. We need to tell WebSphere’s Deployment Manager where to find and authenticate those users.

First we need to consider where our directory is going to be and if there will be one or multiple directories (if all users aren’t held in one place).  For instance, if all my users login to Active Directory I might use that, or I might use Domino which itself could be running Directory Assistance.

When deciding on LDAP directory configuration:

  1. As few directories as possible should be referenced
  2. Every user should have a unique key in the directory
  3. If you use multiple directories, each user should appear only once with their unique key. If my key is my email address, gabriella@connections101.info, then there should be only one entry with that value across all the directories we reference (see the example check after this list).
  4. The directory should be a trusted source with strong password validation
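One way to sanity-check that (a sketch only – the host, search base and filter here are examples from this environment, and the exact options depend on which LDAP client tools you have installed):

ldapsearch -H ldaps://ldap.connections101.info:636 -x -b "o=Turtle" "(mail=gabriella@connections101.info)" dn

If more than one entry comes back across all your directories, that key isn’t unique.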

In this case we’re going to choose a single Domino LDAP server but we could have just as easily chosen Active Directory or many other LDAP directory sources.

Adding external directories in WebSphere is known as adding Federated Repositories. Before modifying or adding any federated repository I like to take a backup of the Deployment Manager.  Very often if the LDAP configuration is wrong you can end up locking yourself out of WebSphere entirely and you will need to restore from that backup.

To back up, go to the /bin directory under /profiles/Dmgr01 and run:

./backupConfig.sh <locationofzipfile> -nostop
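For example (the backup file name here is just an illustration):

./backupConfig.sh /opt/Backups/pre-ldap.zip -nostop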

skitch.11

To add or modify a Federated Repository we go to Global Security in the ISC and choose “Configure” next to the “Available realm definitions – Federated repositories”:

skitch.9

We are going to add an LDAP repository:

skitch.8

Here we enter the details of the LDAP directory source we’re going to use. In this case I have a Domino server running LDAP on hostname ldap.connections101.info and secure port 636. Note how the “Require SSL communications” checkbox is enabled; you must select this if you are connecting to LDAP securely.

Choosing the right Directory type from the drop down list ensures that the LDAP query syntax is correctly formed for the directory you are accessing. Selecting “IBM Lotus Domino” when pointing to an Active Directory server will prevent the directory from working.

I do not recommend using bind credentials to login to your LDAP directory unless you are also using a secure SSL protocol to connect. If you are connecting using 389 or another non-encrypted port I would suggest an anonymous bind.

The field “Federated repository properties for login” determines what values can be used to log in. These are all LDAP attributes that map to fields in the source directories. If possible, in a Connections environment we want to have “uid” listed first. “uid” maps to the value “shortname” in Domino LDAP, “mail” maps to the internet mail address and “cn” maps to my full name. With the values set here as uid,mail,cn I will be able to log in using variations of my name including:

gdavis
gabriella@turtlepartnership.com
Gabriella Davis
Gabriella Davis/Turtle

Finally the value for failover server can be used to point to other identical directories. Although many customers use a separate load balancer to handle LDAP failover, WebSphere actually responds faster to multiple LDAP failover servers entered here rather than waiting for a load balancer to return a new host.

WebSphere will attempt to validate all the information on this screen, including hostname, bind credentials and port, when we save. The next step is to define the Unique Distinguished Name for entries in this repository. Whatever we choose here, it must be unique across all directories.

skitch.6

For this directory I could use the value O=Turtle, and that would restrict valid users to just those with a hierarchical name containing the Turtle organisation, e.g. Gabriella Davis/Turtle. In a Domino directory, unusually for an LDAP source, not all entries are hierarchical (group names, for instance, are flat), so with this configuration WebSphere won’t recognise any Domino groups. One way to work around that is to use the value “root” for the Unique distinguished name, which tells WebSphere to recognise all organisations and even flat names and groups.

skitch

We also need to make sure the Group definitions are correct for the directory type we are using. For Domino the attribute used for defining a group is called “dominoGroup”, not the default “groupOfNames”, and the member attribute is called “member”.

Once the repository is configured we log out of the ISC and restart the deployment manager, then we log back in and go to Users and Groups – Manage Users / Manage Groups to confirm that all our LDAP users and groups are found and displayed.

If I can’t log back in after the restart, I can roll back to the earlier backup of the Deployment Manager configuration and start over.
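That rollback uses restoreConfig.sh from the Dmgr /bin directory, for example (re-using the illustrative backup file name from earlier):

./restoreConfig.sh /opt/Backups/pre-ldap.zip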

skitch.4

Now that the Federated Repository is set up, the next thing we want to do is add additional accounts that can administer the ISC, including an LDAP account which will become the Connections administrator. Here we are using an account called “connadmin” that I created in the Domino directory. Once the account is added, we can test that it works by logging out of the ISC and attempting to log in as “connadmin”.

Creating The Databases For Connections

Before we can install the Connections applications we must create the databases they will use. The database server can be DB2, SQL Server or Oracle, but licensing for DB2 is included in your Connections licensing, whereas SQL Server and Oracle would need to be licensed separately. I tend to use whatever server the customer feels most comfortable supporting; if they are a big SQL Server or Oracle house I’ll use that. For our purposes, and if the customer has no preference, I like to use DB2. One note about high availability: the DB2 license included with Connections covers active/passive HADR, which means creating two servers that sync with each other but with only one active at a time. Making the active server passive and the passive server active requires a manual switchover and will entail some amount of downtime. For an additional license cost DB2 offers a full active/active HADR solution, but I won’t be discussing that on this blog.

We already installed DB2 so now we just need to run the DBWizard to create the databases.  It’s important to use the db2inst1 account we created during install as “owner” of the DB2 instance (on Windows it’s probably called db2admin) to create the databases. This ensures that the db2inst1 has the right security access to the databases and that later, when the Connections applications attempt to write to the databases, everything works.

Now I can log in as db2inst1. Don’t use “su” as that can throw errors; always log in to the server as the db2inst1 (or, on Windows, db2admin) account.

Setting The Environment

The first thing we should do is make sure the instance of DB2 can run all the Connections databases. Run the command

db2 update dbm cfg using numdb 20

to allow up to 20 databases to run in this single instance. That’s a high number, but in a reasonably small environment (say, fewer than 1000 concurrent users) it’s fine.

We also need to set the DB2 instance to use unicode before Connections installs, we can do that whilst we’re here by typing

db2set db2codepage=1208

Then stop and start DB2 to make both the above changes take effect

db2stop
db2start
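If you want to confirm that both changes took effect after the restart, something like the following should show the new values (assuming a standard DB2 command line environment):

db2 get dbm cfg | grep -i numdb
db2set -all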

Creating The Databases

IBM released Day 1 fixes for the DBWizards and you need to find and use these, not the ones that shipped with 5.5. They are on Fix Central and their filenames are

Linux: 5.5.0.0-IC-D1-DBWizard-LO87408_lin_aix.tar
Windows: 5.5.0.0-IC-D1-DBWizard-LO87408_Windows.zip

Using my root account I extract the tar file to a directory like this

mkdir /home/db2inst1/DBWizard
cd /home/db2inst1/DBWizard
tar -xvf /opt/Software/5.5.0.0-IC-D1-DBWizard-LO87408_lin_aix.tar
chown -R db2inst1 * (this recursively changes the owner of all the extracted files to db2inst1)

skitch.9

Launch ./dbWizard.sh from the terminal session. This has a graphical interface, so if you don’t have one configured you will need to create a response file and run ./dbWizard.sh -silent -nameofresponsefile instead.

skitch.8

skitch.7

The same dbWizard is used to create, update and delete databases; it can also just generate the SQL commands so you can carry out these activities manually yourself. Depending on the complexity of the environment, and/or if the Wizard starts throwing errors, I often take the SQL commands and run them manually myself so I can monitor and adjust them if necessary.

skitch.6

The second screen asks for (and defaults to) the location of my DB2 install as well as the account that owns that DB instance.  I am running the Wizard using that account to ensure there are no security issues with permissions on the newly created databases.

skitch.5

I have chosen not to create either the Cognos or the Connections Content Manager databases at this time because I will not yet be installing those features. When I come to install them later I will re-run this Wizard (or the latest version of it) and create them then.

skitch.2

The final screen shows all the commands that the Wizard is about to run to create each of the databases. These commands point to .sql files that were extracted into the DBWizard directory and hold instructions on how to build each database for each application.  I can save those commands to a file and keep them for reference or use them manually later.

If you want to look at what they are doing, the sql files and routines are under the connections.sql directory in the extracted Wizard location.

skitch.1

The wizard will now run through creating every database, logging the activity to the /home/db2inst1/lcwizard/log/dbWizard directory with one log file for each SQL routine. I like to review the log file activity as it progresses in case there are any issues – the entire process can take anywhere from 30 minutes to 2 hours depending on the speed of your disk.

skitch

Once the Wizard has finished, try to connect to one of the newly created databases from a terminal window to confirm the database is there and db2inst1 has the right access to it.

skitch.10

For example to connect to the Activities database I use
db2 connect to opnact
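As an additional quick check you can also list all the databases the Wizard created in this instance:

db2 list db directory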

Next step: configuring LDAP and populating the profiles database

Installing IHS

Connections uses IBM HTTP Server as a front-end webserver. This is because WebSphere servers and their applications don’t listen on the regular HTTP(S) ports 80/443; each of the WebSphere servers I install will assign itself dedicated ports (see the earlier post on WebSphere installation). If I install three WebSphere servers I will have three secure and three non-secure ports that the applications may be listening on. Blogs may be listening on a totally different port than Communities, for instance. As you can imagine, that would make it very difficult to manage and route client requests. By putting a webserver in front of the applications I can route all traffic through the standard HTTPS/443 port.

IBM HTTP Server is part of the WebSphere Supplemental installers.  I extracted those files and configured my Installation Manager (IM) repository to find them

skitch.9

IM now has references to two repositories, the WebSphere installers and the WAS Supplemental installers.

skitch.8

There are a lot of programs in the supplemental package but I want to find IBM HTTP Server 8.5.5.6 so I use my filter at the top to help me find it.  Make sure the “show all versions” checkbox is ticked so earlier versions like fixpack 6 display.  I also want to find and install the Webserver Plugins for WebSphere as we’ll need these to integrate IHS with the installed Connections applications later.

skitch.6

Since I’ve chosen two products to install I need to set two directories to install into. The first directory is for the IHS server itself and defaults to /opt/IBM/HTTPServer; you can change this if you want. I usually don’t unless there’s an issue with the /opt partition.

skitch.5

To verify and change the second directory, for the web server plug-ins, I have to click on that entry in the package group screen. The plugins can be installed anywhere, but I’ll need to remember where I put them for later!

skitch.4

On the next screen I can choose the features to install.  The installer will default to the correct architecture for the OS

skitch.3

The standard webserver port is 80 (this is for HTTP not HTTPS).

skitch.1

.. aaanndd IHS is installed.

I can’t run it yet and my work isn’t done.  I still have to

  1. configure the admin interface
  2. add the webserver to our ISC
  3. configure the http server
  4. create an SSL keyfile and enable SSL

I’m going to go ahead and complete steps 1 and 2 now and come back to steps 3 and 4 after I have Connections installed.

Configure The Admin Interface

To set up the admin interface I need to have an account in Linux I can use.

adduser ihsadmin

I need to set the admin password for IHS. From the directory /opt/IBM/HTTPServer/bin I type

./htpasswd -c /opt/IBM/HTTPServer/conf/admin.passwd ihsadmin

and I will be prompted to set the admin password for ihsadmin

The command to start the admin interface is ./adminctl start but that won’t work until I complete the admin.conf configuration in the /opt/IBM/HTTPServer/conf directory

vi admin.conf

Search for the @ symbols that you will need to replace with ports, for example

skitch

skitch.20
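As an illustration only (the exact directives and placeholder names in admin.conf vary by IHS release, and the group name here is hypothetical), the end result is plain values in place of the @-style placeholders, along these lines:

Listen 8008
User ihsadmin
Group ihsadmins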

The default admin port is 8008.  To verify if I have the configuration file correct I type

/opt/IBM/HTTPServer/bin/adminctl configtest

To start the admin server I use

/opt/IBM/HTTPServer/bin/adminctl start

To verify it has started type netstat -tulvn |grep :8008 and I should see the server listening on that port.

Adding The WebServer To ISC

Log into the ISC at https://<hostname>:9043/ibm/console

skitch.19

Choose “Nodes” under “System administration”. Currently I have two nodes, for the Deployment Manager and the Application Server profiles I created in the previous post where I installed WebSphere. I want to use “Add Node” to add a new node.

skitch.18

Since I’m not adding a WebSphere server but an IHS web server, there’s no way for WebSphere’s Deployment Manager to actually manage it, so I need to add this as an “unmanaged node”. Only WebSphere servers can be added as “managed nodes”.

skitch.17

My unmanaged node properties are:

  • Name – which is only used within the ISC itself and never presented to the users so can be anything unique
  • Host Name – the host where the IHS is installed.  This hostname must be resolvable from the Dmgr.  In this case I installed the IHS on the same server as the Dmgr so they share a hostname
  • Platform

skitch.16

Once completed my new unmanaged node is shown in the list of nodes but as you can see the status column is empty because that column refers to WebSphere sync status and this node is not for a WebSphere server.

skitch.15

Now I’m ready to add my new IHS server on the new unmanaged node to ISC.  Go to Servers – Server Types – Web Servers and select “New” to create a new Webserver instance

skitch.14

Now I can choose the unmanaged node I just created and name the server (choose webserver1 as the name if you can since WebSphere often “assumes” that) and then choose the “Type” (IBM HTTP Server always)

skitch.12

On the next screen I need to tell WebSphere how to connect to the admin interface of IHS, which I just set up using the account ihsadmin and port 8008. If the admin service isn’t started or the ihsadmin account isn’t set up correctly, WebSphere won’t be able to “talk” to IHS at all.

skitch.13

Since I chose IBM HTTP Server on the earlier screen WebSphere knows there’s only one template I can use.

skitch.11

My install is now complete and my IHS web server is now part of my WebSphere cell.

skitch.21

The WebSphere Integrated Solutions Console

Once WebSphere is installed I want to do two things: log in to the admin interface, and back up the environment so I have a “start point”.

WebSphere’s admin interface is called the Integrated Solutions Console and is the same regardless of what product you are working with.

To log in to the ISC, go to

https://bacall.connections101.info:9043/ibm/console

Port 9060, the non-secure port, will redirect to 9043.

skitch.23

Log in here using the credentials created during the original WebSphere install. These credentials are created and stored in the WebSphere default repository during install (and will always be your “backdoor” into the server).

skitch.22

Once logged in I know that WebSphere is correctly installed and running, so I can log out, move on to my next task and come back here later.

Backup

At this point, and before I start further configuration, I like to take a backup of the WebSphere environment. This is simple to do using the backupConfig command.

From the Dmgr profile location
/opt/IBM/WebSphere/AppServer/profiles/Dmgr01/bin type
./backupConfig.sh /opt/Backups/startpoint.zip -nostop

The location and filename of the backup can be anything.

The -nostop option tells WebSphere not to stop its services before it takes a backup. By default it will stop the Dmgr to avoid anyone making updates whilst it is backing up, but since I know I am the only one working on this environment I’m happy to take a “live” backup instead.