
ConfigMgr and back again

Automating life, one Bit at a time.

Archive for the ‘ Operating System Deployment (OSD) ’ Category

Driver Import Issue


I’ve just run into an issue importing drivers for Windows 8.1 (on two Lenovo devices – Helix Gen2 and the ThinkPad Tablet 10) on SCCM 2012 R2 (recently upgraded from 2012 SP1 but this could be irrelevant).

The problem was that the primary site that I was trying to run the Import on was running Windows Server 2008 R2. When trying to import Windows 8.1 drivers, the following error would be displayed in the import wizard dialogue box and in the DriverCatalog.log:

The selected driver is not applicable to any supported platforms. Code 0x80070661

The problem appears to be due to driver signing and the method in which some of the newer drivers are signed. Windows Server 2008 R2 doesn’t recognise the signing method and rejects the drivers.

A KB article (KB3025419) was created to address this issue, which ultimately points to two other KB patches to install.

After installing these patches, and rebooting the server, all drivers now import successfully.

Create task sequence media larger than 32GB


I recently ran into a problem trying to create stand-alone media on a USB key using ConfigMgr 2012 SP1. The task sequence contained driver packages, applications, operating system images and other content totalling more than 32GB. Due to a combination of the ConfigMgr console operating system, ConfigMgr itself, the USB disk formatting method/type and the USB stick size, doing this without prior tweaking is not possible: Windows 7/8 and Server 2008/2008 R2/2012 aren't able to cope with a partition greater than 32GB in conjunction with the ConfigMgr bootable media creation process.

This is what I needed to do to create a 64GB bootable USB stick.

  • Make sure the USB drive in question is removable media and not detected as a portable hard drive, or it won't be seen by the console (a USB stick/key is required, not a USB hard drive).
  • Use a Windows 7, 8, Server 2008/R2 or 2012 machine to format the USB stick with NTFS.
  • Find an old Windows XP machine (yes, they do still have a use) and install the ConfigMgr 2012 console on it.

Windows XP is able to work with drives >32GB, but if you try running the task sequence media wizard at this point, it will fail with the following error:

Query for Win32_Volumes failed. 80041010. Failed to determine whether or not <Drive> is the boot or system volume. Failed to create media (0x80041010).


This is because Windows XP does not have the Win32_Volume class that is used to query available drives.

We therefore need to add the class.

  •  Copy the following MOF text into a new file called config.mof and store it locally on the XP machine with the console installed (e.g. C:\config.mof):

#pragma namespace ("\\\\.\\root\\cimv2")

Class Win32_Volume
{
    [key] String Name;
    Boolean BootVolume;
    Boolean SystemVolume;
};

Instance of Win32_Volume
{
    Name = "E:\\";
    BootVolume = FALSE;
    SystemVolume = FALSE;
};

  • Replace the drive letter in the Instance block above (E:) with the drive letter of your USB drive and save the .mof file – note: keep the double backslashes.
  • If you copy the text from this page, make sure all the quotation marks are plain straight quotes, as WordPress converts them into incompatible speech-marks.
  • Open command prompt and run the following command:

mofcomp.exe <pathToMOF>\Config.mof

e.g.  mofcomp.exe C:\Config.mof

  • Now run the task sequence media wizard and select Stand-alone media.
  • Select USB flash drive and select your drive.

If the following warning comes up, just ignore it, the process will still work fine:

The USB flash drive is not formatted correctly. Format the USB flash drive from a Windows Vista or later operating system.


  •  Complete the wizard with your desired options.

The process will take a while to complete but you’ll have a nice USB stick >32GB at the end.
Be sure to monitor CreateTsMedia.log in the console install directory (e.g. .\Program Files\Configuration Manager Console\AdminUILog   or .\Program Files (x86)\Microsoft Configuration Manager\AdminConsole\AdminUILog) for progress.


It may be necessary to make the USB drive bootable at the end of the process. To do this, just run the following command in an elevated command shell:

bootsect.exe /nt60 E:\ /force /mbr

  • Replace the red text above (E:) with the drive letter of your USB drive.

Note: If bootsect.exe cannot be found in its default location (%WinDir%\System32), you can locate it in the .\boot folder of the Windows 7/8 source media.


OSD Application Migration without displaying UDI Wizard


I’ve been using USMT via OSD to migrate user profiles and settings using a zero-touch approach with no user interaction at all. This was working fine but I thought it would be nice if as well as user data, it also migrated applications. The UDI wizard discovery and selection page is a nice feature but I wanted it to be zero-touch, so I decided I’d incorporate the same discovery and selection but without displaying the wizard. This way, I could still use the same rules and mappings and the UDI wizard designer UI to control the migration behaviour, but without displaying the wizard to the user.

Accomplishing this was pretty easy: it just required three new task sequence steps in place of the UDI Wizard call, plus a custom VBScript.


The first step calls the AppDiscovery executable just like the UDI wizard does, and passes it the required parameters.


Command Line:  AppDiscovery.exe /readcfg:”%scriptroot%\” /writecfg:”%temp%\” /log:”%temp%\AppDiscovery.log”
Where /readcfg points to the location of the Configuration XML from UDI, /writecfg points to the location to write the discovery XML and /log points to the location to create the log file.
Start In: %deployroot%\tools\osdresults

The second step runs the custom vbscript which reads the XML file generated by AppDiscovery.exe, finds the applications that were detected and selected, and then sets them to their respective task sequence variables.


Command Line:  cscript.exe “%ScriptRoot%\Custom\SetAppVariables.vbs” “%temp%\” 64
Where ‘Custom’ is the name of a custom folder within the MDT Files Package ‘Scripts’ folder, the second argument ("%temp%\") is the same path as the /writecfg switch in the ‘Run AppDiscovery’ step above, and 64 is the architecture of the Operating System being deployed.
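The custom VBScript isn't reproduced here, but its core logic can be sketched in Python. Everything about the XML below (the element names, the Architecture attribute, the APPLICATIONSnnn variable naming) is an illustrative assumption, not the real AppDiscovery schema:

```python
import xml.etree.ElementTree as ET

def build_app_variables(xml_text, arch="64"):
    """Parse a discovery XML and return task sequence variables to set
    (variable name -> application name) for apps matching the OS architecture."""
    root = ET.fromstring(xml_text)
    variables = {}
    index = 1
    for app in root.iter("Application"):
        # An app with no Architecture attribute, or marked "All", matches any OS
        app_arch = app.get("Architecture", "All")
        if app_arch not in (arch, "All"):
            continue
        variables["APPLICATIONS%03d" % index] = app.get("Name")
        index += 1
    return variables

sample = """<Applications>
  <Application Name="App1" Architecture="All"/>
  <Application Name="App2" Architecture="32"/>
</Applications>"""
print(build_app_variables(sample, "64"))  # → {'APPLICATIONS001': 'App1'}
```

In the real task sequence, the variables would be written via the Microsoft.SMS.TSEnvironment COM object rather than returned as a dictionary.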

The third step runs one of the built-in UDI scripts to serialize the XML file which sets the task sequence variable ‘ApplicationList’ from the XML file contents.


Command Line: cscript.exe “%deployroot%\tools\osdresults\OSD_SerializeXmlApp.vbs” > “%temp%\SerializeXmlApp.log”

A link to the vbscript can be found Here (in .txt format to allow upload - save or rename to .vbs)



Package References in Task Sequences


MDT Location-based Deployments for Software Installation based on Gateway address is a fantastic way to perform region-specific installs. However, if you need to specify more criteria (for example, architecture for x86/x64-specific applications, or machine type, such as applications that should only be installed on laptops or desktops) then this is where MDT falls down and you have to build application installs into the OSD task sequences themselves.

This is fine; it's very easy to add application installs into the task sequence, and there are very powerful logical conditions that can be built to perform all the necessary filtering. The problem is that all packages referenced anywhere in the task sequence need to be available on all Distribution Points for any location that wishes to use OSD.

Even if you have a WMI filter such as: SELECT ClientSiteName FROM Win32_NTDomain WHERE Description = 'DOMAIN' AND ClientSiteName LIKE '%AD_SITE_NAME%', the package is still required on the DP before the TS is even allowed to launch, because that query is not evaluated until runtime, directly from the running system.

It is therefore a necessity to ensure all referenced packages are available on all DPs, and this can be a challenge to keep on top of, especially if there are frequent updates to packages, modifications to the task sequence, or new sites being added. The easiest way I've found to track package references on all DPs is to use a query (which could be made into a report) showing the number of distribution points that contain each referenced package for a specified task sequence. This is shown below.

SELECT derPackageTbl.ReferencePackageID AS [Package ID], derPackageTbl.Name, COUNT(dbo.PkgStatus.ID) AS [DP Count]
FROM
   (SELECT DISTINCT TOP (100) PERCENT tsr.ReferencePackageID,
       '[' + tsr.[ReferenceName] + '] ' + tsr.[ReferenceProgramName] + ' [' + tsr.[ReferenceVersion] + ']' AS [Name]
    FROM dbo.v_TaskSequenceReferencesInfo AS tsr INNER JOIN
         dbo.v_TaskSequencePackage AS tsp ON tsr.PackageID = tsp.PackageID
    WHERE (tsp.Name = 'Task_Sequence_Name')
    ORDER BY tsr.ReferencePackageID) AS derPackageTbl LEFT OUTER JOIN
   dbo.PkgStatus ON derPackageTbl.ReferencePackageID = dbo.PkgStatus.ID
WHERE (dbo.PkgStatus.PkgServer NOT LIKE '%display=%')
GROUP BY derPackageTbl.ReferencePackageID, derPackageTbl.Name

This generates output such as the following:

Package Reference Output

It is thereby evident that two packages are missing from one and two distribution points respectively. We would then go into each of those packages and add the missing DPs.
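As a quick way to act on the query output, a short script can flag which packages fall short of the full DP count (the package IDs and counts below are made-up examples):

```python
def find_underdistributed(dp_counts):
    """Given {package_id: dp_count} rows from the query above, return
    {package_id: number_of_missing_DPs} for packages not on every DP,
    taking the highest count seen as the target."""
    target = max(dp_counts.values())
    return {pkg: target - n for pkg, n in dp_counts.items() if n < target}

# Example rows (hypothetical package IDs and counts)
counts = {"PS100001": 10, "PS100002": 9, "PS100003": 8, "PS100004": 10}
print(find_underdistributed(counts))  # → {'PS100002': 1, 'PS100003': 2}
```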

OSD with USMT password to Rebuild


In a New Build scenario, where a machine is PXE booted to initiate an OS deployment, there's an option in the PXE service point properties on each PXE-enabled site system to specify a password required for computers to boot using PXE. This is very convenient as it enables us to advertise the New Build task sequence to all machines and add a check (using the _SMSTSinWinPE = True task sequence variable) to ensure that the TS is running in WinPE. This prevents (a) users being able to kick off a full rebuild from within Windows and (b) users being able to PXE boot their own machine and kick it off without the password.

What happens, however, if you want to be able to advertise a Refresh scenario build to all machines, so that a machine can be upgraded or rebuilt using a USMT task sequence without having to add individual machines into a collection on an ad-hoc basis? Luckily, the solution was very simple.

I wanted to replicate the password functionality of PXE boot builds, so I created a very simple six-line program to pop up a password box. If the correct password was entered, it would exit with code 0; if an incorrect password was entered, or the user closed the application using the X, it would exit with code 1. I packaged this up and sent it out to my DPs.
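The original six-line program isn't reproduced here, but its behaviour is easy to sketch (the password value below is a placeholder, and the real tool showed a GUI dialogue rather than a console prompt):

```python
REBUILD_PASSWORD = "placeholder"  # hypothetical value; the real password is not published

def check_password(entered, expected=REBUILD_PASSWORD):
    """Return the process exit code: 0 on a match (the task sequence runs),
    1 otherwise (the pre-TS program fails, so the task sequence never starts)."""
    return 0 if entered == expected else 1

# In the real tool this result feeds the process exit code, e.g.:
#   sys.exit(check_password(input("Enter rebuild password: ")))
print(check_password("wrong"))  # → 1
```

The 'Run another program first' option treats any non-zero exit code as a failure, which is exactly what gates the rebuild.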

I then went to my task sequence, right-clicked and went to Properties, then navigated to the ‘Advanced’ tab. There is then an option to Run another program first, just like in package/program chains. I selected my Password prompt application as below:

Task Sequence Advanced Tab Image

I could then advertise my task sequence to any machine I wanted, and when I ran the advert, I would be presented with the following dialogue box:

Rebuild Password Image

If I then enter the incorrect password or close using the X, it fails with exit code 1, which in turn fails the parent program (the task sequence). If I enter the correct password, it returns code 0 and continues to execute the rebuild process.

The good thing about doing it this way is that I can also advertise the same task sequence to my New Builds collection. When machines PXE boot, they just receive the task sequence and not the pre-TS password application, so they can boot as normal, but still get the password box prompted by the PXE Service point. I now have a single architecture task sequence that installs 64-bit Windows 7 on machines with an x64 capable processor, 32-bit Windows 7 on those without, that will run a new-build scenario when PXE booted, and a refresh with USMT and Package Mapping scenario when run within Windows, after entering the correct password. All thanks to just 6 lines of code.

PackageMapping – Populating Relationships


When creating relationships between Add/Remove programs display names and Package ID/Program names, it occurred to me that manually modifying the database is tedious and time-consuming. Even creating procedures to import directly from the ARP table in the CCM database is time-consuming, removing the junk items and generally tidying up the clutter.

I therefore decided to spend 15 minutes putting together a really simple front-end to manage the links. It allows the creation of mappings based on a package/program (i.e. to map the same program to multiple ARP names), which is much easier to manage than creating individual ARP display name values and then having to input the same package ID and program for each one. It’s also much easier to see what you already have and gives the ability to search for existing packages or mappings to modify, delete or add new ones.

MDT DB App Image

MDT DB App Image 2

I have made this available via the Downloads section of this site [Link], there are however a few things to bear in mind:

  1. I am not a developer, therefore what I’ve put together is probably poorly coded and full of bugs, but it seems to do the job.
  2. It requires a precise design for the PackageMapping table (shown below), including the addition of a ‘Comments’ field which isn’t there as standard.
  3. It requires .Net Framework 3.5 client on the machine that’s running it, but a database server/instance and database can be specified for remote execution.
  4. The user running the program will require Connect, Select, Update, Insert and Delete permissions on the MDT Database to perform all functions.
  5. Always make a full backup of the database before using any third-party tools on it, especially mine.
  6. I provide the tool free of charge in the case that others can benefit from it, but cannot be held responsible for any loss of data that results from its use.

PackageMapping table design required:

  • ARPName [Primary Key] : nvarchar(255) : Allow Nulls – False
  • Packages : nvarchar(255) : Allow Nulls – True
  • Comments : nvarchar(MAX) : Allow Nulls – True
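Expressed as T-SQL, the table design above corresponds roughly to the following (the dbo schema is an assumption):

```sql
CREATE TABLE [dbo].[PackageMapping] (
    ARPName  NVARCHAR(255) NOT NULL PRIMARY KEY,
    Packages NVARCHAR(255) NULL,
    Comments NVARCHAR(MAX) NULL
);
```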

Package Mapping Design Image

PackageMapping Configuration


There are numerous guides showing how to configure MDT Package Mapping (otherwise known as Application Mapping or Application Migration), including an MSDN article (Link). That is not the aim of this post. I’ve just configured Package Mapping for use with OSD and came across some obstacles along the way. The aim of this post is to outline those obstacles and the steps taken to overcome them.

First of all I created the RetrievePackages stored procedure as per the MSDN article above. I, as lots have done before, decided to modify it slightly to use Add/Remove Programs display name rather than Product Code for simplicity and manageability. My stored procedure therefore looked like this:

/****** Object: StoredProcedure [dbo].[RetrievePackages] ******/
CREATE PROCEDURE [dbo].[RetrievePackages]
@MacAddress CHAR(17)
AS
/* Select and return all the appropriate records
based on current inventory */
SELECT * FROM PackageMapping
WHERE ARPName IN
   (SELECT DisplayName0
    -- ConfigMgr inventory views; prefix with your site database name if this procedure lives in the MDT database
    FROM v_GS_ADD_REMOVE_PROGRAMS AS a, v_GS_NETWORK_ADAPTER AS n
    WHERE a.ResourceID = n.ResourceID AND
    MACAddress0 = @MacAddress)

I then modified the CustomSettings.ini within my Settings Package to include the elements described in the MSDN article and sent it out to my distribution points. I populated the PackageMapping table with some test application display names and linked those to their equivalent package ID and program name. I ran the deployment process and of course, it failed.
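For reference, the database-lookup elements added to CustomSettings.ini look roughly like the following (the server and database names are placeholders, and the section name is arbitrary as long as it appears in Priority):

```ini
[Settings]
Priority=GetPackages, Default

[GetPackages]
SQLServer=MDTSQLServer
Database=MDTDB
StoredProcedure=RetrievePackages
Parameters=MacAddress
```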

There was no task sequence failure, and nothing of interest was logged in the SMSTS.log; the failure occurred while performing tasks against the MDT Database. The following was logged in the BDD.log when trying to execute the RetrievePackages procedure:

Error -2147217887 : Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.

This error didn’t really seem to indicate anything in particular to me, so I decided that, like 90% of other database issues, the problem was down to permissions. The connection was being attempted using the computer’s machine account, and since this account would be different on every machine being built, I granted Connect, Select and Execute permissions on the MDT database to ‘Domain Computers’. I kicked off the process again, and again it failed with the same problem.

The information in the CustomSettings.ini is retrieved, and the data sources it references (in this case, the MDT Database) are queried, by the MDT task sequence step ‘Gather’. Looking through the task sequence for all instances of Gather, I noticed that the only one before the point where the machine gets joined to the domain is a local-only gather. There was another Gather that uses the rules within CustomSettings.ini, but this came after the Join Domain step. The point where the items retrieved via CustomSettings.ini get processed is the State Restore. I therefore needed another Gather between joining the domain and performing the state restore. It was at this point that I decided to restructure things slightly to accomplish another requirement…

Package Mapping can’t be turned on or off on a per-machine basis. It’s either on for the task sequence, or off for the task sequence, and these changes involve modifying the CustomSettings.ini and sending the modified file out to all DPs. I decided I wanted to do something about this. I wanted the ability to have one Settings package that would stay on the DPs, but also have the ability to test Package Mapping before making it live, and have one task sequence that uses it, and one that does not, so that there’s a choice whether to Migrate Applications or not.

CustomSettings.ini Image

The permission changes I’d already made worked well: a connection to the database was successfully made, the procedure was successfully executed, the data was successfully retrieved, and the packages were installed as part of the ‘Install Multiple Applications’ step later in the task sequence, as additions to the MDT ‘PACKAGES’ base variable.

I then ended up with two task sequences: one that included the additional Gather to process the Package Mapping rules, which I could use to migrate applications (and to initially test the process), and one without this additional Gather, which could be used to migrate user files and settings, but not applications.

PXE Service Point Failures


When performing a general infrastructure health check and reviewing the site status messages one day, I noticed that for a few sites, in the Component Status view, the SMS_PXE_SERVICE_POINT component was showing as Critical Status due to an Availability of ‘Failed’:

Component Status Image

I immediately contacted the office in question and was informed that PXE was functioning fine. I hopped onto the secondary site server in question and checked the Windows Deployment Services service and this was running. I opened up the SMSPXE.log from the client logs directory on the server and observed machines successfully contacting the PXE server and retrieving advertised task sequences.

I then opened up the pxecontrol.log from the server logs directory and the problem became evident. The log was reporting "PXE test request failed, status code is -2147467259, 'Error receiving replies from PXE server'":

pxecontrol.log Image

The first IP address used to perform the PXE test was that of an iSCSI adapter. This failed, and the subsequent adapters then failed too. All our PXE Service Points are set to respond to requests on all adapters. The server list used by the PXE service point to perform availability checks is populated with the addresses of all network adapters on the system, in the order defined in the Adapters and Bindings connection list. I confirmed the iSCSI adapter was at the top of the connection list, so I changed the priority so that the local NIC was first in the order. This had the result shown below:

Adapter Order Image

pxecontrol.log After Image

The change was detected in the registry and applied. Then, exactly 5 minutes later, the local addresses were added to the array in the revised order.

Upon performing the first test with the correct NIC, the request succeeded and no further test was necessary. Shortly after, this then updated the status in the ConfigMgr console; the component status went green and Availability showed as ‘Online’:

Component Status - After Image
