Imaging Workflows for OS X and When to Use Them

Knowing all the different imaging workflows for OS X and when to use them can save you a lot of time and energy. Depending on your experience level, familiarity with the platform, and willingness to try new things, some or most of these workflows might be new to you. Current methods of imaging range from monolithic imaging (which most of us have done at one point or another) to thin imaging and user provisioning (which is more common with heavy scripters). In this post, we will dive into the pros and cons of each imaging method and when each might make sense for your deployment.

Please note that the examples below are not the only ways these imaging methods can be used. Some administrators use only one of the methods below and are very successful. The key to finding the best imaging workflow for you is to figure out what is efficient and easy for your deployment. However, we always want to keep an eye out for what could save us time and energy in the future!

Imaging Workflows Covered:

  • Monolithic Imaging
  • Package Based
  • Compiled Package Based
  • Thin Imaging
  • User Provisioned

Monolithic Imaging

Most Mac admins have at one point or another used a monolithic image. I made my first monolithic image when I took my old iBook, booted it from an external drive, and made a backup via a DMG. I was doing this so that I could upgrade from 10GB of space to a whopping 20GB. To get my computer working again, I restored my data to the new drive. While that might not be what we have in mind when imaging a hundred or a thousand computers, the idea is the same: everything goes into the image, so we do not have to create multiple images or workflows for deployment. While some might argue that monolithic imaging is dead, I believe it is still one of the more common methods used.
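The capture-and-restore routine described above can be sketched with `hdiutil` and `asr`. The volume and image paths below are hypothetical examples, and since these commands are destructive and exist only on OS X, this sketch just prints each command unless you set RUN=1 on a machine where you really mean it:

```shell
#!/bin/sh
# Sketch of a monolithic capture and restore. All paths are hypothetical
# examples -- adjust them for your own environment.
SOURCE_VOL="/Volumes/Macintosh HD"          # the configured system to capture
IMAGE="/Volumes/External/monolithic.dmg"    # where the image will live
TARGET_VOL="/Volumes/Target HD"             # the disk being imaged

# Print commands instead of running them unless RUN=1, since asr and
# hdiutil are OS X-only and the restore step erases the target.
run() { if [ "${RUN:-0}" = "1" ]; then "$@"; else echo "$*"; fi; }

# 1. Capture the whole volume into a compressed, restorable disk image.
run sudo hdiutil create -srcfolder "$SOURCE_VOL" -format UDZO "$IMAGE"

# 2. Scan the image so asr can block-copy it (this is where the speed comes from).
run sudo asr imagescan --source "$IMAGE"

# 3. Block-restore the image onto the target volume, erasing it first.
run sudo asr restore --source "$IMAGE" --target "$TARGET_VOL" --erase --noprompt
```

The ASR block copy in step 3 is why monolithic imaging is fast package for package: one image goes down in a single pass, no matter how much is baked into it.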

A common use for monolithic imaging is large imaging projects: ones in which hundreds or thousands of Mac computers need to be imaged with a large and/or highly customized package set. This is also typical for labs of computers in schools or shared environments.


Pros:

  • Package for package, quicker due to the ASR block copy
  • Easy to create via many tools
  • WYSIWYG; everything is configured just the way you want it, with very little left to scripting or trial and error


Cons:

  • Not very agile as software vendors increase their pace of updates
  • Time-consuming if you want to customize for different groups of computers
  • Not all included software titles are used by end users
  • Requires documentation of changes and settings to make sure each rebuild is the same

Package Based

Most Mac administrators would consider package based imaging to be the second generation of imaging. The community looked at monolithic images and said, “We need to scale more easily and be more agile when new software is released.” In doing so, we moved from building one very large base image to a base image containing only the operating system. Once the base operating system is copied, other packages are deployed during imaging and startup to add software titles or configurations. Think of this method as similar to building an iTunes playlist: pick a song from here, pick a song from there, and soon you have your playlist (or in this case, an imaged computer)!

Common uses for package based imaging are the same as for monolithic imaging. Most will use this method in place of monolithic imaging to increase agility, at the cost of some speed during imaging. Please note that migrating to package based imaging might not be faster package for package, but it will allow you to trim out packages that you do not need (speeding up the process).
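The "playlist" part of this workflow boils down to running the OS X command-line installer once per package after the thin base OS is laid down. The volume and package directory below are hypothetical, and the sketch only prints the commands by default since `installer` exists only on OS X:

```shell
#!/bin/sh
# Sketch of the package based "playlist": install each package in order
# onto a freshly imaged volume. Paths are hypothetical examples.
TARGET="/Volumes/Target HD"
PKG_DIR="/Volumes/External/Packages"

# Print commands instead of running them unless RUN=1.
run() { if [ "${RUN:-0}" = "1" ]; then "$@"; else echo "$*"; fi; }

# installer(8) is the OS X command-line package installer; most imaging
# tools are doing the equivalent of this loop under the hood.
for pkg in "$PKG_DIR"/*.pkg; do
    run sudo installer -pkg "$pkg" -target "$TARGET"
done
```

Swapping a song in the playlist is just swapping a .pkg in the directory, which is exactly the agility this method buys you.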


Pros:

  • Quick to change packages or operating systems for imaging
  • Easy to keep up with software updates
  • Can build multiple different workflows or imaging configurations without duplicating work
  • Most management tools allow packages to be reused for other purposes, like patching, software deployment, or self service


Cons:

  • Package for package, slower than monolithic imaging because only the base image (or first package) is copied via ASR
  • Initial setup takes time, since each package is built separately
  • Some titles will require testing and modification due to how their installers are designed
  • Scripts might have to be run to configure specific settings

Compiled Package Based

The biggest negative of package based imaging is speed. Since we might have five, ten, or twenty packages to install, it will take longer than just dropping down one DMG via ASR. This is where the idea of compiled package based images came from. The process is very much a hybrid of monolithic and package based imaging. Basically, you create your package based workflow with a base image, packages, scripts, and anything else you want, and use that workflow for smaller imaging projects of ten or twenty computers. However, once a shipment of a hundred or a thousand computers comes in, we want to get those computers set up and ready to go as fast as possible. To do this, we image one machine with our package based workflow, capture the finished product (as we would with a monolithic image), and then use the new image on the large quantity of computers. Some third party management tools, like the Casper Suite, simplify this workflow to just the click of a button.

The common use for compiled package based images is larger imaging projects. Since this method requires you to set up package based imaging first, the additional work is only rewarded if you have several hundred computers to image. Please note: a compiled image is a snapshot in time. Any change to the workflow will require you to recapture or rebuild the compiled image.


Pros:

  • Best of both worlds: package based agility for quick changes, monolithic speed for larger projects
  • Can save lots of time on larger projects


Cons:

  • Need to rebuild the compiled image each time you change a package in the workflow
  • Harder to troubleshoot issues because everything is recompiled into one image

Thin Imaging

There is a lot of discussion in technical circles about what thin imaging means. Is it really imaging? Thin imaging, in my opinion, is taking a Mac out of the box and layering your configuration on top of it: adding software titles, running scripts, or binding it to a directory service. While we might not be erasing the hard drive, we are booting the computer via NetBoot, an external drive, or target disk mode to add what we need to it.

This method is great for computers that are fresh out of the box. However, I can hear some folks wondering, “What will happen a year or two from now when that computer has totally gone haywire and needs to be wiped and reloaded?” With Internet Recovery, Apple created an over-the-internet NetBoot for any Mac running 10.7 Lion or newer. While this isn’t as fast as having a “traditional” imaging workflow, imagine all the time saved by not having to rebuild base images. This workflow is good for admins who do not re-image computers often.

Common users of thin imaging are Mac admins who have very small deployments or little need for frequent re-imaging. I have also seen it be very successful for large deployments of new Macs: year one uses this imaging technique, and year two uses one of the other three methods above.
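A thin-imaging "layer" is often nothing more than a first-boot script that installs a few packages and binds the shipped OS to your directory service. The package path, AD domain, and bind account below are all hypothetical, and the sketch only prints the commands by default since `installer` and `dsconfigad` are OS X tools:

```shell
#!/bin/sh
# Sketch of a thin-imaging first-boot script: layer software and settings
# on top of the OS the Mac shipped with. All names here are hypothetical.

# Print commands instead of running them unless RUN=1.
run() { if [ "${RUN:-0}" = "1" ]; then "$@"; else echo "$*"; fi; }

# Install a site package onto the running system (target is the boot volume).
run sudo installer -pkg /Library/Provision/office-suite.pkg -target /

# Bind to a directory service; dsconfigad is the Active Directory binding
# tool on OS X. Domain and service account are placeholders.
run sudo dsconfigad -add ad.example.org -computer "$(hostname -s)" \
    -username svc_bind -password 'changeme'
```

In practice a management tool would run something like this at enrollment time; storing a bind password in a plain script, as shown, is only for illustration.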


Pros:

  • No need to create a base image (saving time and energy)
  • One of the quickest ways to get a computer up and running
  • No duplicated work (the computer already comes with a working OS)


Cons:

  • Re-imaging can take several hours if using Apple Internet Recovery
  • Additional work via scripting could be required
  • Not very scalable for organizations that need to do lots of re-imaging

User Provisioned

The last “imaging” workflow really isn’t imaging at all. I know, this is supposed to be an imaging guide, but I feel it is important to cover this workflow. While it has its roots in iOS, we have seen features move between iOS and OS X. The two most important management examples of this are configuration profiles and the Device Enrollment Program (DEP). I won’t cover configuration profiles in this post, but DEP is certainly something we should plan on using in the future. In short, it allows IT to say, “These devices belong to my organization and must do these tasks when they boot up.” Currently, we are limited to enrolling the Mac into MDM, deploying configuration profiles, and sending a few remote commands. If you have access to the WWDC 2014 session videos (Managing Apple Devices), you will notice that this program will be getting more robust for OS X.

So now that we know a little of the history, how does it work? Give the computer to your end user. Don’t image it, don’t break the seal, and don’t panic! Have them set up the computer themselves and enroll it into your management tool. If you are using DEP, this will happen during the Setup Assistant. If you do not have DEP, third party management tools typically offer an over-the-air enrollment option. Once users have enrolled, deploy software and settings automatically, or use self service tools to allow end users to choose what they need.
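For the non-DEP case, over-the-air enrollment often comes down to fetching an MDM enrollment profile and installing it with the OS X `profiles` tool. The server URL and file name below are hypothetical, and the sketch only prints the commands by default since `profiles` exists only on OS X:

```shell
#!/bin/sh
# Sketch of a non-DEP, over-the-air enrollment step that an end user (or a
# welcome script) could run. The MDM URL and paths are hypothetical.

# Print commands instead of running them unless RUN=1.
run() { if [ "${RUN:-0}" = "1" ]; then "$@"; else echo "$*"; fi; }

# Download the enrollment profile from your management server...
run curl -o /tmp/enroll.mobileconfig https://mdm.example.org/enroll.mobileconfig

# ...then install it; profiles(1) with -I -F installs a configuration
# profile from a file, which is what enrolls the Mac into MDM.
run sudo profiles -I -F /tmp/enroll.mobileconfig
```

With DEP, none of this is needed: enrollment happens automatically during the Setup Assistant.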

Common uses of user provisioned workflows are organizations with lots of remote staff. Drop shipping computers directly to end users saves time and energy for IT. However, since this is a provisioning method rather than imaging, we would work Apple Internet Recovery into our re-imaging process.


Pros:

  • Very little hands-on work for IT; everything is workflow based
  • Users feel like it is their computer and become stewards of their tech
  • Great for organizations without central locations


Cons:

  • Re-imaging can take several hours if using Apple Internet Recovery
  • Additional work via scripting could be required
  • Requires trust that end users will enroll their computers (if not using DEP)
  • DEP doesn’t yet have all the features to make this workflow as attractive as it could be

What’s your favorite imaging workflow? Please share in the comments below!
