Locating Computer GPO Registry Values

I come across this scenario all the time: an HKLM registry setting needs to be configured. Typically this would be implemented via Group Policy but, for whatever reason, you wish to set the resulting registry value directly. It might be because you don't want to cut a new GPO just for a couple of servers or workstations. A common requirement is setting the RDS licensing server as part of an automated deployment. Maybe you use RES Automation Manager like we do. However, this scenario is not limited to RES Automation Manager; you could use the information in this post to configure a few specific settings as part of a WDS deployment, for example.

Hopefully by now you are all familiar with the free Virtual Engine Toolkit (VET). No!? Shame on you! I suggest you take a look over here and see how it can help migrate from an unmanaged user environment to a managed one.

So you now know VET is especially good at converting user-related GPOs into .REG files that can be imported into your UV/UEM tool of choice, e.g. RES Workspace Manager or AppSense Environment Manager. One of VET's hidden talents (undocumented until now) is that it can also convert computer-related GPOs into .REG files.

Using the settings above as an example, I'll run you through how we achieve this with RES Automation Manager rather than with a GPO. If you've read our series on user GPO migration, then you're aware that GPO settings (well, not all of them!) are just registry settings. The problem we normally have is knowing where these values live and what they should be set to.

You could at this point download the Microsoft Group Policy Settings Reference guide and find the individual registry keys. You could use the Group Policy Search site, which Kees Baggerman spotted and pointed out in this blog post. You could spend time Googling them, at which point you would have to start manually adding them to the registry task in AM. But it's much, much simpler to use VET!

NOTE: the same process could be used for migrating multiple existing computer-related GPOs into AM, but please be aware that the computer will probably need a reboot before the targeted settings come into force.

  1. The first thing we need to do is create a dummy GPO where we can set the various policies we'd like included in AM. In my example I've called my GPO "Dummy GPO for VET" and configured the settings we'd like to apply, as in our example above.
  2. Next we need to launch VET and use the "Convert Group Policy Objects Wizard" to scan the SYSVOL folder for our newly created (or existing) GPO. Once VET displays the list of GPOs, select the one that you wish to convert, then click "Next".
  3. Select "Use subfolders for User and Machine policies". Deselect "Also create RES Workspace Manager Building Block Files", then click "Next", "Next" and "Finish".
  4. Looking in the "Documents\Machine" folder you'll find the newly created .REG file containing our settings (an example of its contents is shown after this list).
  5. Now launch the RES AM console and create a new module containing the "Registry Settings (Apply)" task. It's then simply a case of importing the .REG file created previously into that task.
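For our RDS licensing example, the exported .REG file contains entries along these lines (the license server name here is hypothetical, so verify the exact values against your own VET output):

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows NT\Terminal Services]
"LicenseServers"="licserver01.example.local"
"LicensingMode"=dword:00000004

A LicensingMode of 4 equates to Per User licensing; 2 is Per Device.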

It’s as simple as that! We’ve used a dummy GPO that is not applied to any computer objects, set our required settings and imported the exact resultant registry values into RES Automation Manager. You can probably think of other great use cases for this too.

You never know, we might incorporate the ability for VET to generate RES Automation Manager building blocks in the future. Hope this little gem helps you as much as it has helped me!

Nathan

RES AM: Passing Values Between Scripts

You don't need to be told how great RES Automation Manager is, but there are some things that we can only achieve via scripts, be it VBScript or PowerShell. In my case that means scripting XenDesktop and XenServer for the demo showcase platform (more on this at a later date); there is currently no way to automate these products without using scripts. Unfortunately (for me) it has always been problematic to pass values out of scripts to other modules. We can certainly pass a value into a script, but we can't then return it to be used elsewhere.

My problem required creating an AD user (not via the built-in task) in one script and then passing the username/password into another script. To overcome this particular issue, I started down the route of temporarily writing the information to the registry so that it could be read by the other script later in the Project. This is where I stumbled across a little gem hidden in RES Automation Manager. I don't know whether it's intentional and/or undocumented, but it certainly works!

I attempted to use a Parameter with the built-in @[REGISTRY] function. In essence this instructs the RES Automation Manager agent to populate the Parameter with the contents of a registry value. This bit is simple to understand and you probably already knew it. However, what I didn't realise is that this Parameter is updated/re-evaluated at every Task within a Module. I had assumed that it would only be evaluated when the Module is invoked by the RES AM agent. I'm certainly glad that this is not the case, as we can now write values to the registry and AM will automatically pass the updated value to the next Task(s).

[wpdm_file id=8]

Here is an example building block that contains a single module with a single registry-based, empty Parameter. The first script writes the current date to a temporary location in the registry (which just so happens to be where RES Automation Manager reads the Parameter value from). The second script receives its Parameter value from RES AM (not directly from the registry within the script), adds a day (in US format!) and writes it back to the registry. The final task displays a pop-up message with tomorrow's date from the RES AM Parameter.
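If you're curious what those scripts look like, here's a minimal PowerShell sketch; the registry location is hypothetical, so point the @[REGISTRY] function at whichever value you actually use:

# Script 1: write today's date (US format) to the registry value that the
# @[REGISTRY]-based Parameter reads from (the path here is hypothetical)
New-Item -Path 'HKLM:\SOFTWARE\VirtualEngine' -Force | Out-Null
Set-ItemProperty -Path 'HKLM:\SOFTWARE\VirtualEngine' -Name 'Date' -Value (Get-Date -Format 'MM/dd/yyyy')

# Script 2: RES AM substitutes the re-evaluated Parameter into $[Date];
# parse it, add a day and write it back for the next Task to pick up
$current = [datetime]::ParseExact('$[Date]', 'MM/dd/yyyy', $null)
Set-ItemProperty -Path 'HKLM:\SOFTWARE\VirtualEngine' -Name 'Date' -Value $current.AddDays(1).ToString('MM/dd/yyyy')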

What this does prove is that the Parameter is re-evaluated before each task is executed and is therefore passed through all tasks. At no point in this example module do we enter the date manually. Here is the status of the parameter before and after each task.

Task 1 – BEFORE: <Empty>, AFTER: <Empty> (We write today’s date to the registry, but it’s not re-evaluated until the next task)
Task 2 – BEFORE: <Current Date>, AFTER: <Current Date> (We write tomorrow’s date to the registry, but it’s not re-evaluated until the next task)
Task 3 – BEFORE: <Tomorrow’s Date>, AFTER: <Tomorrow’s Date>

I'm sure you can think of more ingenious ways of using this functionality. Enjoy!

Iain

BigHand Digital Dictation and RES Automation Manager – License file update

Following on from the BigHand Digital Dictation and RES Automation Manager – BHF File Locations post, I’ll show you how easy it is to update the BigHand license files across all BigHand servers in your estate with RES Automation Manager.

Managing your BigHand Digital Dictation system using RES Automation Manager

BigHand is a powerful dictation product used in many environments, but is most prevalent in the Legal and Medical sectors. With enhancements such as speech recognition and workflows, more and more users are relying on this business-critical tool. In these posts I will go through automating two of the most common tasks required on the system: updating license files and locating a dictation file.

Update license files

License files are supplied by BigHand as a .lic file. They must be placed in a particular location on every BigHand server in your environment. If you have a BigHand file store server in every site, a BigHand gateway for telephony or the Mobile client, you can quickly end up with lots of servers requiring a license file. Updating them all is a time-consuming process which can easily be automated, and here's how.

Stop services
All the BigHand services should be stopped before you update the license file, and this needs to be done in a particular sequence. For this you can use the Service Properties (Manage) task in the Configuration folder; if you would rather script the sequence, see the sketch after this list.

  1. Stop branch site BigHand services (if you only have one BigHand server you can skip this)
    a. On the Settings tab select Change Service State
    b. In Service Name enter BigHand Server x.x (where x.x is the version number), or connect to the server using the browse button and select the service
    c. In New Service State select Stop service.
    d. You need to set a condition on this task so that it only runs for branch sites, so select the Condition tab.
    e. Under Expressions select Add and then Registry Setting
    f. Click the browse button next to the Operand field and connect to your Master BigHand server's registry. Drill down to the following key and double-click it: HKEY_LOCAL_MACHINE\SOFTWARE\BigHand\BHServer_x.x\ServerGuid
    g. The Operand and Value will be added to the fields; ensure the operator is Equals and click OK.
    h. Select "If the condition is TRUE then Skip this task".
    i. Select "If the condition is FALSE then Execute this task, but skip all remaining tasks in this module" and click OK.
  2. Stop all Master services
    a. Create another 5 Manage service tasks and repeat steps a – c for the following services (the order is important to avoid conflicting dependencies):
    i. BigHand Gateway service
    ii. BigHand External Workflow server service
    iii. BigHand Active directory service
    iv. BigHand Services host service
    v. BigHand Server x.x service
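If you'd rather wrap the master-site stop sequence into a single PowerShell task than create five Manage Service tasks, a rough sketch would be the following (the display names are assumed from the list above, and x.x is your version number):

# Stop the BigHand services in dependency order
$stopOrder = 'BigHand Gateway',
             'BigHand External Workflow Server',
             'BigHand Active Directory',
             'BigHand Services Host',
             'BigHand Server x.x'
foreach ($name in $stopOrder) {
    # Continue on error so a service that isn't installed doesn't abort the run
    Stop-Service -DisplayName $name -Force -ErrorAction Continue
}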

Backup old license file and copy new license file

  1. Click Add and select Files (Perform operations) from the Configuration folder
    a. On the Settings tab click Add and select Delete as the action type
    b. In the source path enter C:\Program Files\Common Files\BigHand\BHServer.old (the default location) and click OK.
    c. On the Settings tab click Add and select Rename as the action type
    d. In the source path enter C:\Program Files\Common Files\BigHand\BHServer.lic
    e. In the destination path enter C:\Program Files\Common Files\BigHand\BHServer.old and click OK.
    f. Ensure Ignore Errors is checked and click OK.
    g. Click Add and select Resource (Download) from the Provisioning folder
    h. On the Settings tab click Add and browse to the license file resource
    i. Check Specify destination folder, enter C:\Program Files\Common Files\BigHand\ as the destination and click OK.
    j. Clone both of the file tasks you have just created, open them and change the source and destination paths to C:\Program Files\BigHand\BigHand Services\
    k. These two file tasks will need conditions so they only run on the master server: click the Conditions tab and click Add
    l. Select Registry Setting and add the value from step 1f above.
    m. Select "If the condition is TRUE then Execute this task" and "If the condition is FALSE then Skip this task", then click OK. (A scripted equivalent of these file operations follows below.)
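For reference, the backup-and-replace logic in those file tasks boils down to the following (AM downloads the licence resource for you; the UNC source path below is purely illustrative):

$dir = 'C:\Program Files\Common Files\BigHand'
# Delete any previous backup, then keep the outgoing licence as BHServer.old
Remove-Item "$dir\BHServer.old" -ErrorAction SilentlyContinue
Rename-Item -Path "$dir\BHServer.lic" -NewName 'BHServer.old' -ErrorAction SilentlyContinue
# Copy the new licence file into place (illustrative source location)
Copy-Item '\\fileserver\licences\BHServer.lic' -Destination $dir
# The clone tasks repeat the same steps for C:\Program Files\BigHand\BigHand Services\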

Start services

  1. For this you can use the Service Properties (Manage) task in the Configuration folder again, except this time you must select Start service as the new service state. The tasks must run in the following order:
    a. BigHand Active directory service
    b. BigHand External Workflow server service
    c. BigHand gateway service
    d. BigHand services host service
    e. BigHand Server x.x service (Master site)
    f. BigHand Server x.x service (Branch).
  2. Be sure to set a condition on these using the BigHand server GUID as before; that way the job will not try to start services that do not exist.
  3. Finally, you could configure a job to check each service and email the results (a combined start-and-check sketch follows below).
  4. That's it, you're done.
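Again, a scripted equivalent of the start sequence plus the final health check might look like this (same display-name assumptions as the stop sketch above):

# Start the BigHand services in the order listed above
$startOrder = 'BigHand Active Directory',
              'BigHand External Workflow Server',
              'BigHand Gateway',
              'BigHand Services Host',
              'BigHand Server x.x'
foreach ($name in $startOrder) {
    Start-Service -DisplayName $name -ErrorAction Continue
}
# List anything that failed to start so a follow-up task can email the results
Get-Service -DisplayName 'BigHand*' | Where-Object { $_.Status -ne 'Running' }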

BigHand Digital Dictation and RES Automation Manager – BHF File locations

Whilst this post is targeted at retrieving archived BigHand dictation file locations, it does demonstrate how we can use RES Automation Manager parameters and its Microsoft SQL integration to good effect.

Locating a Dictation File

BigHand holds dictations in a proprietary format with the extension .BHF. This is essentially just a WAV file with minor modifications. Once a user has completed a dictation it will disappear from their view after a pre-determined number of days. If you need to retrieve one of these files once it has been removed, or if a system corruption means you need to re-import dictations, there is no easy way to identify the file: the files have GUID filenames and can be sitting on any file store server in your BigHand estate. Using a combination of SQL and Automation Manager you can make everyone's life easier.

Create SQL View

First you will need to create a SQL view on your BigHand SQL instance called Find_Dictation_Location, as per the script below:
SELECT a.BH_FirstName, a.BH_LastName, a.BH_UserName, b.BH_Title,
       CONVERT(varchar(10), b.BH_CreationDate, 103) AS BH_CreationDate,
       b.BH_CompletionDate, b.BH_Deleted AS BH_Deleted_Task, b.BH_Destination,
       b.BH_Description, b.BH_MatterNumber, b.BH_DocumentType, b.BH_Confidential,
       b.BH_FileRequired, b.BH_Open, c.BH_FileGuid, c.BH_Location, c.BH_Version,
       CAST(d.BH_URL AS nvarchar(4000))
           + CAST(SUBSTRING(c.BH_Path, 2, LEN(c.BH_Path)) AS nvarchar(4000)) AS Path,
       c.BH_Deleted AS BH_Deleted_File
FROM dbo.BH_Users a WITH (NOLOCK)
     INNER JOIN dbo.BH_Tasks b WITH (NOLOCK) ON a.BH_UserGuid = b.BH_Author
     INNER JOIN dbo.BH_FileLocations c WITH (NOLOCK) ON b.BH_TaskGuid = c.BH_FileGuid
     INNER JOIN dbo.BH_Locations d WITH (NOLOCK) ON c.BH_Location = d.BH_Location

Automation Manager SQL Query

Next you need to run a SQL query against that view to retrieve the URL of the file. The query needs editing every time you use it, so let's create an AM job to do the editing for us.
1. Create a Module and Add a SQL Statement (Query) task from the advanced folder
2. On the settings tab add the following SQL query:
select BH_UserName, BH_Title, BH_CreationDate, Path, BH_Deleted_File from Find_Dictation_Location
where BH_UserName like '%$[Username]%' and
BH_Title like '%$[Title]%' and
BH_CreationDate like '%$[Date]%'
Note the parameter references in the SQL query ($[Username], $[Title] and $[Date]). These are the fields required to identify the dictation. Obviously these can be changed to any of the fields available in the SQL view.
So that AM can insert the correct values into the query, you will need to create parameters for:
a. Username (Username)
• This will allow a full or part entry of the user who created the dictation
b. Dictation title (Title)
• This will allow a full or part entry of the Dictation title
c. Time and Date (Date)
• This must be the exact creation date; you can set a mask for the parameter on the input tab in the format 00/00/0000.
Once these are created, you will be able to run the query against your SQL instance. It will collect information on the dictations you specified by looking up the relevant GUIDs in the database, and it builds a field (Path) containing the URL assembled from the separate BH_Locations and BH_FileLocations tables. This will then be presented in the Job results for the query.
View results in Automation Manager Job History

Change RDS User Logon Modes using RES Automation Manager

If you are using Windows 2008 R2 Remote Desktop Services you might have noticed that there are various user logon modes available on the Remote Desktop Session Host, which you can see in the screenshot below:

These are all well and good should you wish to manually change the user logon mode, e.g. allow reconnections but prevent new logons until the server is restarted for, say, maintenance purposes. But if you are already using RES Automation Manager, why not complete this task in a more automated fashion? This can easily be achieved by adding the following tasks and module parameter (the equivalent native commands are shown after the task list).

The module is made up of five tasks, conditioned on a single module parameter, "User Logon Mode", whose value (1 to 5) determines which task runs:

Module Task 1 (Enable Logons):
  1. Add Task > Remote Terminal Server Logons.
  2. Settings > Enable Logons.
  3. Condition Expression > User Logon Mode = 1.
  4. Condition Expression > Computer Function = Terminal Server.
  5. If condition is TRUE then > Execute this task, but skip all remaining tasks in this module.
  6. If condition is FALSE then > Skip this task.
Module Task 2 (Disable Logons):
  1. Add Task > Remote Terminal Server Logons.
  2. Settings > Disable Logons.
  3. Condition Expression > User Logon Mode = 2.
  4. Condition Expression > Computer Function = Terminal Server.
  5. If condition is TRUE then > Execute this task, but skip all remaining tasks in this module.
  6. If condition is FALSE then > Skip this task.
Module Task 3 (Allow connections):
  1. Add Task > Registry Settings (Apply).
  2. Settings > Add the following registry value.
  3. [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Terminal Server]
    “TSServerDrainMode”=dword:00000000.
  4. Condition Expression > User Logon Mode = 3.
  5. Condition Expression > Computer Function = Terminal Server.
  6. If condition is TRUE then > Execute this task, but skip all remaining tasks in this module.
  7. If condition is FALSE then > Skip this task.
Module Task 4 (Allow reconnections, but prevent new logons):
  1. Add Task > Registry Settings (Apply).
  2. Settings > Add the following registry value.
  3. [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Terminal Server]
    “TSServerDrainMode”=dword:00000002.
  4. Condition Expression > User Logon Mode = 4.
  5. Condition Expression > Computer Function = Terminal Server.
  6. If condition is TRUE then > Execute this task, but skip all remaining tasks in this module.
  7. If condition is FALSE then > Skip this task.
Module Task 5 (Allow reconnections, but prevent new logons until server is restarted):
  1. Add Task > Registry Settings (Apply).
  2. Settings > Add the following registry value.
  3. [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Terminal Server]
    “TSServerDrainMode”=dword:00000003.
  4. Condition Expression > User Logon Mode = 5.
  5. Condition Expression > Computer Function = Terminal Server.
  6. If condition is TRUE then > Execute this task, but skip all remaining tasks in this module.
  7. If condition is FALSE then > Skip this task.
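For reference, outside of RES Automation Manager the same five modes map onto the built-in change logon command and the registry value used above, along these lines:

# Tasks 1 and 2: enable or disable remote logons
change logon /enable
change logon /disable

# Tasks 3 to 5: set the drain mode directly, using the values from the
# tasks above (0 = allow connections, 2 = prevent new logons,
# 3 = prevent new logons until restart)
Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\Terminal Server' -Name 'TSServerDrainMode' -Value 2 -Type DWord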

Once you have created the module, scheduling the job is then a simple matter of selecting which logon mode you would like to apply from the job parameters. Of course, you can schedule the job against individual agents, multiple agents or teams of agents.


To make life even easier I’ve created a handy building block that you can import into your environment.

[wpdm_file id=6]

Any questions just ask.

Enjoy

Nathan

Patch Management with RES Automation Manager

It's a question that I get asked a lot: can you use RES Automation Manager for patch management? As you may well know, my stock answer to nearly all questions is "YES! However…". In this particular case I'm going to stick with my stock answer, and I'll explain why here.

Using RES Automation Manager to manage application deployment and patching is absolutely the right thing to be doing, but not for Microsoft products, i.e. Windows and Office updates etc. RES Automation Manager provides us with the advanced scheduling capabilities that are required when deploying updates. We can send feedback to users, control reboots and also create our own dependencies and prerequisites using the built-in Conditions. Creating installation and/or update modules for Adobe Reader, Flash and Java etc. is a fairly straightforward process.

So why can't we use RES Automation Manager for MS updates? Let's get one thing clear: there is nothing stopping you from doing this. However, do you really (and I mean REALLY!) want to be creating modules for every hotfix, patch and Service Pack that Microsoft releases? What about controlling installation order and prerequisites? What about testing whether an update is actually needed before attempting to install it? When I last checked, WSUS listed more than 20,000 Windows and Office updates. Hopefully you get the point!

What can we do then? Leverage the update metadata that Microsoft has already created, so we don't have to create it ourselves; I'm guessing the product teams inside Microsoft are the best people to produce this information anyway! We use WSUS internally to manage our patching for Microsoft products on clients, servers, hosted servers and training labs. We recommend that our customers either use WSUS or an existing patch management tool if one is already in place. We then manage all other application installations and updates as Modules within RES Automation Manager and use its advanced scheduling capabilities to push them out.

Surely if we’re going to implement WSUS for Microsoft patching, it would make sense to patch all our products with an extension to WSUS? Not necessarily. Remember you’re getting a whole load more with RES Automation Manager than just pushing out patches; think Run Books and Evaluators. We can deploy software and machine configurations, provision resources such as Active Directory accounts and Exchange mailboxes and a whole lot more.

For those that want to extend WSUS there are products available in the market to integrate 3rd party patches such as Adobe and Java. Check out the EminentWare web site for one such example. You can roll your own application updates as well with the free Local Update Publisher on SourceForge.

At the end of the day, if you have to package a single internally developed application, or products not supported by the 3rd party tools, you might as well extend that list to common middleware such as Flash and/or Java and save the money on the 3rd party integration. Leverage WSUS for what it's good at and use RES Automation Manager for what it's very good at; the rest!

Migrating RES Databases to a New SQL Server

We recently had a requirement to move our RES databases to a new Microsoft SQL Server 2008 R2 server. This process is not complicated, but there does seem to be a lack of documentation available. At a high level we need to perform the following actions:

  1. Back up the RES Workspace Manager and Automation Manager databases on the source server;
  2. Restore the RES Workspace Manager and Automation Manager databases on the destination server;
  3. Fix the SQL permissions, i.e. recreate the users and redelegate access;
  4. Update the RES Automation Manager Dispatchers to point to the new database server;
  5. Update the RES Workspace Manager Agents to point to the new database server.

I'm not going to cover Steps 1 and 2 as these are well documented on Microsoft's web site and various other blogs. In this particular instance we're moving from SQL 2008 to SQL 2008 R2 and I've restored copies of the Workspace Manager and Automation Manager databases on the new server. The SQL user account for the Workspace Manager database (RES-WM) is 'RES-WM' and the user account for the RES Automation Manager database is 'RES-AM' (note that naming the database and user accounts the same is not best practice, but it helps in our lab environment!).

Migrating RES Automation Manager

We'll start with the RES AM database, as we'll then use this to update the RES WM information! Firstly, we need to check that the correct user permissions have been granted on the new database server. When creating the SQL user accounts you'll need to ensure that the password policies are set correctly:

In short, make sure that the user password policies are disabled (unless you want to be forever updating your Dispatchers!). If you forgot to untick this and can't seem to change it, you can run the following SQL script via SQL Server Management Studio (remember to change the RES-AM references to your SQL user account, and 'samepassword' to the account's existing password!):

USE Master
GO
ALTER LOGIN [RES-AM] WITH PASSWORD = 'samepassword'
GO
ALTER LOGIN [RES-AM] WITH CHECK_POLICY = OFF, CHECK_EXPIRATION = OFF;

After this is complete you will need to ensure that the RES-AM user account has DB Owner (DBO) rights to the database via the “User Mapping” page of the user account:

Once we’re happy with this we can focus our attention on the RES Automation Manager console. As it’s only the RES AM Consoles and Dispatchers that talk directly to the SQL database, we do not need to worry about the RES AM Agents. From the RES AM management console select the Infrastructure > Setup > Database node and enter the new SQL database server name (if the username/password has changed you can update them here).

After you click the Connect button the management console will reload and ask if you want to use the connection information permanently.

The management console will reload. The final piece to this puzzle is to update all the Dispatchers and Consoles. From the Infrastructure > Engines node we need to repair each Dispatcher and update the SQL connection information.

After the Dispatchers have been updated, don't forget to update the consoles in the same manner. You could also run a registry job via AM to update the connection information (remember to do this from the old database, as the Dispatchers will be communicating with the old database until they are updated!) or push out a new MSI from the Components node.

Migrating RES Workspace Manager

Note: RES Workspace Manager 2011 has the built-in ability to migrate the existing database to a new SQL server and/or database; this can be found in Setup > Datastore > Connections > click on the '…' next to the Primary datastore to display the migration wizard. At the end of the migration process, after various other prompts, you will also be asked whether you wish to create a handy building block that can be used in a RES Automation Manager Module. You can use this Module to migrate RES Workspace Manager Agents running an older version of RES Workspace Manager that does not yet contain the Datastore Migration Wizard. The only downside to the migrate method I've found is that you have to activate the licenses again; if this is going to cause issues, follow the procedure set out below. [Nathan Sperry]

RES Workspace Manager is slightly different, as all RES WM Agents talk directly to the SQL server rather than via a Dispatcher. After migrating the RES Workspace Manager database (as above) and fixing the user permissions, we need to update the RES WM Agents' registry settings, and naturally we do that via RES Automation Manager! For this task I created a module that updates the required registry value and restarts the RES Workspace Manager Agent service.

You will notice that there are two Registry Settings tasks with conditions; one is for 64-bit machines and the other for 32-bit. Note: if the authentication details have changed you'll need to add the relevant registry settings to both Registry tasks.
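As a rough sketch, the registry task plus service restart amount to the following; note the key path, value name and service display name here are assumptions from our lab, so check them against one of your own agents before rolling this out:

# Assumed datastore key: 32-bit agents under HKLM:\SOFTWARE\RES\Workspace Manager,
# 64-bit agents under HKLM:\SOFTWARE\Wow6432Node\RES\Workspace Manager
$key = 'HKLM:\SOFTWARE\RES\Workspace Manager'
Set-ItemProperty -Path $key -Name 'DBServer' -Value 'NEWSQLSERVER'
# Restart the agent service so it reconnects to the new database server
Restart-Service -DisplayName 'RES Workspace Manager Agent'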

Note: if you have a mixture of freshly deployed RES Workspace Manager agents and agents upgraded from RES PowerFuse 2010 or earlier, the registry settings are in different locations and you may need more tasks/conditions!

Iain

Add computers to a local group using RES Automation Manager

One of the many built-in tasks provided in RES Automation Manager is the ability to add users or groups to a local group on a server or desktop. The task does what it says on the tin, but one little-known fact (well, to me at least until I tried it) is that you can also add computers to a local group using the same task. A very simple example of this is when using Remote Desktop Services and the Remote Desktop Connection Broker: each Remote Desktop Session Host participating in the farm needs to be added to a local group on the Remote Desktop Connection Broker called "Session Broker Computers".

To add a computer to the local group you simply need to add a dollar sign ($) after the computer name in the "User(s) and/or group(s) to add" field, as you can see in the screenshot below.
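Incidentally, the same trick works from the command line with the built-in net localgroup command (the domain and server names below are hypothetical):

net localgroup "Session Broker Computers" CONTOSO\RDSH01$ /add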

It’s as simple as that!!

Nathan

Updating your XenApp farm using RES Automation Manager

When publishing an application across multiple servers in a XenApp farm, one of the key elements of a trouble-free environment is consistency across the farm. RES Automation Manager can help with getting this right.

So, you have your software package imported into Automation Manager and you deploy it to each XenApp server in turn. Here I will go through a method of updating that software and running maintenance automatically, with no outage and (most importantly) an easy life for everyone.

First we should set out exactly what we need to do.  There are several stages to this process:

  1. Disable access to the XenApp server.
  2. Ensure users are not logged on.
  3. Switch to install mode.
  4. Check the version and run the installation.
  5. Switch to execute mode.
  6. Enable logons to the server.

These stages can easily be broken down into three Automation Manager Modules; let's take a look at how to set them up.

Module 1
Disable logons, wait for the server to be free from users and switch to install mode.
We have three tasks to run here:

Task 1 – Disable logons
Click Add to create the first task of the module. Select Remote terminal server logins (Change) from the Configuration folder and select:
Disable remote logins

Make sure you name the task appropriately and you’re done.

Task 2 – Wait for the server to be free from users
Best practice when installing software on a XenApp server suggests that you should not have any user sessions on that server.  Therefore we need to wait for users to leave.  Now I’m not all that interested in sitting in front of the Access Management Console hitting refresh every 5 minutes until everyone is off, so let’s get AM to do that for us!

We need to make use of the PowerShell capabilities for this, so click Add and select Windows PowerShell Script (Execute) from the Advanced folder. One of the recent improvements to AM is the ability to input a script directly into the task; this helps keep the whole process in a single place. You can also easily save the script as a resource and point the task to it (good if you have a single script being reused in multiple tasks).

On the settings tab select “Use Windows PowerShell script from “Script” tab” to enable the Script tab, or point the task to the .ps1 file if you have already saved it as a resource.

By default, the PowerShell scripts that you run using this Task need to be digitally signed.  Select Override execution policy for this Task to temporarily lower the PowerShell execution policy to “Unrestricted” and use an unsigned PowerShell script.  After execution of the Task, the PowerShell security will be reverted to the previous security level.

Next you will need to set the timeout. Since you want the script to run until all users are logged off, you need to set this to the maximum allowed, which is 300 minutes. It would be nice if there were an option to disable the timeout, but there isn't at the moment, so five hours will have to do for now.

Once you’re done here you’ll need to click the script tab and add the script.

The Script

The script checks the session count on the XenApp server every 5 minutes. If any user sessions are disconnected, it will log them off. The script will loop until there are zero sessions or the timeout is reached. This is the key to making a RES AM job wait for a condition to occur before moving on, since the native AM Conditions are only evaluated once at the start of the job.

Although you will not be able to view the output of the script while it is running, it is kept in the job history to help with diagnosis.
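To give you an idea of the approach, here's a minimal sketch of such a loop, assuming the XenApp 6.x PowerShell SDK (the Citrix.XenApp.Commands snap-in) is installed on the server; do verify the cmdlet and property names against your XenApp version:

Add-PSSnapin Citrix.XenApp.Commands

do {
    # All sessions on this server, ignoring listeners and the console
    $sessions = Get-XASession -ServerName $env:COMPUTERNAME |
        Where-Object { $_.State -ne 'Listening' -and $_.Protocol -ne 'Console' }

    # Log off disconnected sessions rather than waiting for them
    $disconnected = @($sessions | Where-Object { $_.State -eq 'Disconnected' })
    if ($disconnected.Count -gt 0) { $disconnected | Stop-XASession }

    if ($sessions) { Start-Sleep -Seconds 300 }   # re-check every 5 minutes
} while ($sessions)

Write-Output 'No user sessions remain.'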

Script File (Zip)

Task 3 – Switch to install mode
The final Task in this Module switches the XenApp server to install mode. Using the Command (Execute) task from the Advanced folder, we will run the following command:

change user /install

Make sure you tick Use Windows command interpreter.

This command is not strictly required any more; XenApp is intelligent enough to recognise an install process and switch to install mode automatically. But since AM uses the system account, and since you won't always just be running an MSI or an EXE, it's better to set it and be sure.
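If you want to confirm which mode the server is in at any point (in a follow-up task, for example), the same command interpreter will tell you:

change user /query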

At this point your XenApp server is ready to accept whatever installs and modifications you need to apply.  You could use this set of tasks and finish with an email notification telling you the server is ready to manually have other modules run against it.  However, we are looking for a fully automated process.

Module 2
Check version and run Installation, configuration etc.

Module 2 is the set of tasks you want to run against the box.   Here I am going to use an MSI that I built using a combination of VSS and WixBuild to demonstrate a fully automated software update process.

To start with, I save the MSI as a resource. The resource type should be "stored in datastore"; this way AM assigns a GUID to the resource, and I will explain why you need this GUID later.

Next I need to add a new Module and create the task, in my case this was a Windows Installer package task from the Provisioning folder.  On the settings tab click the Browse button next to the Filename field and select the resource, configure any other settings (such as Properties or Transforms on the Parameters tab) and click OK.

Note:  With MSI Installs I would always recommend using the Log tab to set the required level of logging and click “Remember as Default”.  This way you will have the installer log files available in the job history should you ever need to diagnose an issue.

Once you have added all the required tasks (including any reboots needed) you are almost ready, just one final module to create.

Module 3
Switch to execute mode and enable logon to the server

This is the final module.  All you need now is for the XenApp server to be made available.  For this you will need a module with 2 tasks (as in Module 1) with the following:

Task 1 – Command (Execute), but this time we will run “change user /execute”

Task 2 – Remote terminal server logins (Change).  This time we are going to Enable remote logins.

Run Book

Run Books are used to create a chain of jobs; each job can be run on a different agent in the same run book.

Add a Run Book, then on the Jobs tab add Module 1 (Disable logons, wait for the server to be free from users and switch to install mode), select the XenApp server as the agent and Click OK.  Repeat this for Module 2 (Check version and run Installation) and 3 (Switch to execute mode and enable logon to the server).

Once these are added, the jobs can be cloned and the agent name changed so that you have Modules 1, 2 and 3 running on each XenApp server in turn, or you can use Teams and split the job to schedule the upgrades/updates in batches.

Schedule and New versions

This Run book can then be scheduled to run at an appropriate time using Job scheduling.

Once the schedule is in place, all you need to do to update the Citrix farm is open the resource files that Module 2 points to and update them. Since AM has assigned a GUID to the resources, the new files will automatically be associated with the task. The next time the job runs, each Citrix server will disable logons, wait for each user to log off (or log off disconnected sessions), run the new MSI to update the software and re-enable itself.

You meanwhile, can sit back and relax.

If you want to avoid running the Modules during every schedule (as there may not have been an update to the software), you can use a combination of Evaluators and Conditions to ensure that the specified tasks/modules do or do not run as required. Make sure the first task is an Installed Programs query (found in the System State folder), then configure an Evaluator that checks for the latest version number and sets a parameter to "True" or "False". Once this is in place, you set a condition on each individual task to run dependent on the Evaluator. Using this method you can quickly build up a single Run Book that runs all your regular Citrix maintenance.

Dan

Guest Blogger: Dan Beeson

It's my pleasure to introduce Dan Beeson, who will be a guest blogger on the Virtual Engine blog. Dan is one of a handful of techies in the UK to be awarded the RES Software Valuable Professional (RSVP) title. Needless to say, this is something that is earned and not something you can buy (not that I've asked!). He currently works for a law firm in London and has vast experience deploying and managing desktops.

Dan has been using RES Automation Manager for many years and also has recent experience with Service Orchestration in a production environment. As Service Orchestration is "new in 2011", there are not many people in the UK who are in a position to deploy it to its full potential. Therefore, there will hopefully be some great blog posts in the coming months.

Nathan and I would also like to take this opportunity to personally thank Dan for a) taking the opportunity to help the community and b) taking the time to do this. We both know how much time and effort is needed to write good blog posts, hence our own lack of recent activity!

Rest assured that there are many things in the pipeline. In the meantime, Dan will hopefully be filling in the gaps!

Thanks, Iain