
Configuration of Multi-Server Farm

Oct 1, 2012 at 1:33 PM

Hello everyone,

Concerning all the work that is done by the AutoSPInstaller, there is one thing I do not understand, although I reviewed all the docs and guides.

For a centralized deployment, what is the correct process for getting the prerequisites and binaries installed on multiple servers? Where do I tell the script which servers it should take care of?

Can anyone please point this out for me?

Many thanks


Oct 2, 2012 at 3:56 AM

You don't actually need to do anything special in the XML to get prerequisites and binaries installed on specific servers. Having a particular server's name included in the appropriate place in the XML (for example, in one of the <provision> attributes) is enough to have it included in the farm and in the scope of the remote install, so it will have the prerequisites and binaries installed (since this is required to be a member of the farm, after all).
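As a rough illustration of the point above: listing a server name in a Provision attribute is what brings it into scope for the remote install. This is a hypothetical sketch only; the element and attribute names here are approximations and should be verified against your actual AutoSPInstallerInput.xml, and SERVER1/SERVER2 are placeholder names.

```xml
<!-- Hypothetical fragment; names are illustrative, not a verbatim copy
     of the shipped AutoSPInstaller schema. -->
<Farm>
  <!-- SERVER1 hosts Central Administration -->
  <CentralAdmin Provision="SERVER1" />
  <Services>
    <!-- Listing SERVER1 and SERVER2 here puts both servers in scope for
         the remote install, so prerequisites and binaries get installed
         on each of them automatically. -->
    <DistributedCache Provision="SERVER1 SERVER2" />
  </Services>
</Farm>
```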


Oct 2, 2012 at 11:01 AM

Hello Brian,

I already figured out something like that in the meantime. However, a few more words in the readme would help to avoid some confusion. :)

But there are still some points:

  1. Given the scenario of 2 SharePoint servers (and a separate SQL server, of course), what is the best procedure? Use a separate config file per server, or one config file for the whole farm? Honestly, I do not understand the difference between the two options yet.
  2. The binary installation does not kick off in parallel as I expected, even though I set the <RemoteInstall> and <ParallelInstall> parameters to true. What am I missing here?

Many thanks in advance,


Nov 8, 2012 at 3:06 AM

1. Use the same XML config file for all servers. There is no real need to use a separate XML file for each server anymore.

2. The binary install actually only kicks off in parallel for remote servers; if the server you're launching AutoSPInstaller on is the first server in your farm, it will complete the whole script locally first before it attempts remote/parallel installs. If you want a true parallel install on all servers simultaneously, you can kick off AutoSPInstaller on a server that's not part of the farm (e.g. the SQL server, or any other server with connectivity to the rest of the servers).
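The thread refers to the <RemoteInstall> and <ParallelInstall> settings; a hedged sketch of how those might look in the install section of the config follows. The exact element and attribute shapes are assumptions here, so check them against your own AutoSPInstallerInput.xml before relying on this.

```xml
<!-- Hypothetical sketch; verify names against your actual config file. -->
<Install SPVersion="2013">
  <!-- Enable remote installation of prerequisites/binaries on the
       other farm servers. -->
  <RemoteInstall Enable="true">
    <!-- Parallel installs apply to *remote* servers only: the server
         you launch from still finishes its local run first, unless you
         launch from a machine outside the farm (e.g. the SQL server). -->
    <ParallelInstall>true</ParallelInstall>
  </RemoteInstall>
</Install>
```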


Apr 26, 2013 at 8:06 PM
I am attempting a centralized deployment as well. The script runs fine up to the point of installing prerequisites on the first remote server:
  • Prerequisite Installer completed in 00:14:11.
    WARNING: 2013-04-26 14:47:41 - Error when enabling ASP.NET v4.0.30319
    WARNING: 2013-04-26 14:47:41 - Last return code (1)
  • A known issue occurred configuring .NET 4 / IIS.

- Setting AutoSPInstaller information in the registry...

  • Script halted!
  • One or more of the prerequisites requires a restart.
  • Setting RunOnce registry entry for AutoSPInstaller...
  • Setting AutoAdminLogon in the registry for Domain\SPSAdmin...
  • Disabling User Account Control (effective upon restart)...

- The AutoSPInstaller script will resume after the server reboots and Domain\SPSAdmin logs in.

| Automated SP2013 install script |
| Started on: 4/26/2013 2:33:26 PM |

| Aborted: 4/26/2013 2:47:42 PM |

Press any key to exit...

I haven't pressed any key yet, but I noticed the script is continuing to run on the remote server after it restarted itself (I never logged in; I am just monitoring the console). What is the best thing to do here? If I press any key, I'm fairly certain the script will continue by installing and configuring the next server...

Apr 28, 2013 at 2:06 PM
Hi Joe,

Where are you running the script from? App server? SQL server?

Let the script finish on the remote server; it will get to a point where it says "binaries have been installed" and "press any key to configure the farm".

At this point you can either run the farm config for that server or go back to the controller server and re-run the script from there. It will run through the script again on the first server but skip all the parts that have already been installed, and you will see the same window saying the above again. Once completed, it should then kick off the script on the second server.
Apr 28, 2013 at 6:07 PM
I ran it from the App server where Central Admin will be installed. I did not do anything until the script threw an exception on the first remote WFE while trying to configure the next remote server!? After that happened, I figured it was safe to press any key on the app server - it continued by remotely installing the second remote server (until it blew up because of the same issue as in my original post). The farm got configured properly as far as I can tell so far - it just wasn't graceful. Of course, it may have gotten ugly if the second server had configured the third - but the firewall config actually prevented it (thankfully).

I've used this script several times for several clients and this is the first time I've run into it - must be because of the required reboot during prerequisite install - I don't recall that happening before now.