- Is your Windows IT organization lacking in "scripters"?
- Having a hard time determining how to encourage scripting without increasing risk to your infrastructure?
- Do you have a handful of "scripters" but no real overall plan for how to utilize them more broadly?
- Do you believe that only rockstar IT talent can be trusted with scripting?
Often in Microsoft technology focused environments, management sees scripting as a last resort for accomplishing a project because there are typically too many unknowns and a large amount of perceived risk. Management also tends to view scripting as something that should only be done by IT talent with an existing skill set or a proven track record. The traditional approach to any project in a Windows environment is to have interactive sessions with many different servers, which is very labor intensive. In most markets there is constant pressure to keep headcount as low as possible. This often leads to situations where large projects that would require many labor hours are avoided until absolutely necessary, or the issue is simply worked around indefinitely. The typical result is a never-ending fire-drill environment where the IT talent browses job sites after every pressure-soaked midnight conference-call support session is resolved. This kind of environment is chillingly exposed in the book The Phoenix Project if you want to read deeper.

Over the years, remote desktop and various built-in and third-party remote management tools have made this easier, but these approaches are often used inconsistently, and non-standardized server environments create unpredictable results. In those environments, scripting is often done by a very small group of people who take it upon themselves to use scripting to reduce their personal workloads. This usually leads to very little sharing of code and widely different approaches to scripting. Error checking, logging, and other elements of a robust script are often scrapped to meet a project deadline, or because those features are beyond the capability of the person doing the scripting. The result is a culture where all the risk and reward is placed on these individuals, and scripting usage in the organization lives or dies based on their results.
If any of this sounds familiar, then I have helpful information for you. By focusing on Microsoft PowerShell as your primary scripting language and incorporating some tenets of professional software development, it is possible to reduce the risk of scripting projects and decrease the amount of time each one takes. There is the added benefit of reducing the pressure on your IT talent. There are no magic solutions here, but these approaches will move your organization closer to being a script-first organization that is more agile. I have laid out two different tracks depending on the current level of scripting being done in your environment.
101 Track
For the organization that has no in-house scripting talent and needs to build institutional trust around scripting
- Ensure there is a centralized storage location for scripts
- This can be a file share, a team SharePoint site, or any kind of shared folder
- All team members should have access
- The folder structure should be similar to the following
- Production_Scripts (this folder contains release versions of the scripts. If the IT team is large, it may be a good idea to make this section read-only for the overall team and only allow a few senior team members write access. This prevents accidental changes to production scripts. Each sub-folder should be named after its script and should contain the script, documentation on how to use it, and any supporting files.)
- Users (these folders are named after the users on the team involved in creating scripts. This is where those users store scripts that are in process. I recommend that the specific user has Read-Write access and the entire team has Read-Only access. This allows for easier sharing of in-process code.)
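The repository layout above can be stood up with a few lines of PowerShell. This is a minimal sketch: the root path here is a temp folder for illustration (in production it would be something like a file share), and the user names are placeholders.

```powershell
# Sketch: create the recommended script-repository layout.
# $repoRoot and the user folder names are placeholders for illustration.
$repoRoot = Join-Path ([IO.Path]::GetTempPath()) 'ScriptRepo'
$folders = @(
    'Production_Scripts',
    (Join-Path 'Users' 'jsmith'),
    (Join-Path 'Users' 'alee')
)
foreach ($f in $folders) {
    # -Force creates intermediate folders and does not error if they exist
    New-Item -Path (Join-Path $repoRoot $f) -ItemType Directory -Force | Out-Null
}
Get-ChildItem -Path $repoRoot -Recurse -Directory | Select-Object -ExpandProperty FullName
```

Permissions (read-only for the team, write for senior members) would then be applied to Production_Scripts through your normal NTFS/share ACL process.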
- Gain Confidence with Scripts that Collect Information
- The first step is to get familiar with how PowerShell works at the prompt, using simple single-line commands that collect information. At this stage, only test commands that make changes in your test/development environment.
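As an example of the kind of read-only one-liner to start with, the following collects information from the local machine without changing anything. The specific commands are just illustrations; any information-gathering cmdlet works here.

```powershell
# Read-only one-liner: the five processes using the most memory.
# Nothing here changes the system, so it is safe to run at the prompt.
$topProcs = Get-Process |
    Sort-Object WorkingSet -Descending |
    Select-Object -First 5 Name, Id, WorkingSet
$topProcs
```

Running a handful of commands like this builds the muscle memory for piping, sorting, and selecting properties before any change-making script is attempted.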
- Choose a currently manual, repetitious task that requires a lot of effort and can be reduced to a reasonable single-line PowerShell command. For example, you might need to query 500 servers for installed software to confirm a configuration setting.
- Confirm the PowerShell command in a test environment, then manually on a small test group of production servers.
- Use Excel to generate a command per target (blog coming soon on how to do this). For example, if you want to gather configuration information from every Windows 2012 server in your environment, start with a list of all the server names and the tested single-line PowerShell command that gets the information you need. Then use Excel to generate the commands, substituting in each server name, and run them one at a time or in batches. This avoids the need to handle looping in your script and builds confidence in your approach. Loops are the first hard concept for beginners, and using Excel removes them as a hurdle to utilizing scripting.
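For readers past the Excel stage, the same technique can be sketched in PowerShell itself: concatenate a tested one-liner with each server name to produce a batch of ready-to-paste commands. The server names and queried class are placeholders for illustration.

```powershell
# Sketch: generate one ready-to-paste command per server, the same way the
# Excel technique concatenates a tested one-liner with each server name.
# Server names are placeholders; in practice import them from a file.
$servers = @('SERVER01', 'SERVER02', 'SERVER03')
$commands = foreach ($s in $servers) {
    "Get-CimInstance -ClassName Win32_OperatingSystem -ComputerName $s"
}
$commands   # paste these one at a time, or in small batches
```

The output is plain text, so it can be reviewed line by line before anything is run, which keeps the confidence-building benefit of the Excel approach.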
- Collecting data is usually an easy sell to management because the risk to the environment is very small, if not zero.
- Ensure there is logging of all the results. To start, this can be just capturing all output from the command prompt, but utilizing objects and CSV outputs makes this much easier. It is important to do this for the data-collection script because the same tactics will be used for the configuration-change script later.
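The object-and-CSV style of logging can be sketched as follows. The server names, status values, and log path are placeholders; the point is that results are captured as objects and written with `Export-Csv`, so the same pattern carries over unchanged to change scripts.

```powershell
# Sketch: capture per-server results as objects and log them to CSV.
# Server names and Status values are placeholders for illustration.
$logPath = Join-Path ([IO.Path]::GetTempPath()) 'collection-log.csv'
$results = foreach ($s in @('SERVER01', 'SERVER02')) {
    [pscustomobject]@{
        Server    = $s
        CheckedAt = (Get-Date).ToString('s')
        Status    = 'OK'   # in a real run, the command's outcome goes here
    }
}
$results | Export-Csv -Path $logPath -NoTypeInformation
Import-Csv -Path $logPath | Format-Table
```

Because the log is structured, it can later be filtered in Excel or re-imported with `Import-Csv` to drive a follow-up script.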
- Once confidence has increased around using scripting for data collection, use the same approach for a project where changes need to be made to the environment. Provisioning new elements in the environment is a good option. Here are some examples: creating new Active Directory (AD) Organizational Units, creating new AD user accounts, or creating a predefined folder structure on a new file server or network storage device.
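The folder-structure example above is a good first change script because it is easy to verify and easy to rerun. Here is a minimal sketch; the root path and folder names are placeholders for illustration (in production the root would be a path on the new file server).

```powershell
# Sketch of the folder-structure provisioning example: create a predefined
# layout for a new file server. Root path and folder names are placeholders.
$shareRoot = Join-Path ([IO.Path]::GetTempPath()) 'NewFileServer'
$layout = @('Departments/Finance', 'Departments/HR', 'Public', 'Software')
foreach ($d in $layout) {
    New-Item -Path (Join-Path $shareRoot $d) -ItemType Directory -Force | Out-Null
}
Get-ChildItem -Path $shareRoot -Recurse -Directory   # review what was created
```

Because `-Force` makes the script safe to rerun, it can be tested repeatedly in the test environment before being pointed at production.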
201 Track
For the organization that has utilized scripts to change production environments but wants to make the process more efficient.
- Look over all the elements of the 101 track and implement any not present in your environment.
- Define Scripting Style Guide and Best Practices for the Organization
- Define a standards guide & best practices for each language and the types of scripts you want to utilize. All scripts used in the environment must then adhere to this guide.
- The style guide should be a living document which is owned by the team releasing the scripts. Any dispute should be handled by a vote on the team or a final management decision.
- It is important that all sides have buy-in on the end goal, even if they do not agree on every detail.
- This allows for a common ground in scripting style and approach which makes sharing code and code review easier.
- A common feeling among traditional Windows-environment scripters is that they do scripting in their own style and it works well enough.
- Because of this there may be push back on moving toward a standardized approach.
- If this occurs, it is important to reinforce that the goal is to spread the effort of scripting across more of the IT team, which in turn spreads out the workload and the risk.
- Reuse Code between projects
- Often scripting in Windows environments is done in silos, where each person has their own personal library of scripts, with some sharing when dictated by a project or conditions. This approach must be avoided: it greatly increases the time required to produce new scripts and increases risk, because each person writes their scripts in a different way. Share code across teams by allowing read-only access to the centralized script repository.
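One simple way to reuse code is to keep common functions in a shared library file that every script dot-sources from the central repository. This is a sketch: the library path, function name, and log format are all placeholders for illustration.

```powershell
# Sketch: a shared library file that individual scripts dot-source.
# $libPath and Write-ProjectLog are placeholders for illustration; in
# production $libPath would point into the central script repository.
$libPath = Join-Path ([IO.Path]::GetTempPath()) 'Common-Functions.ps1'
@'
function Write-ProjectLog {
    param([string]$Message)
    "{0}  {1}" -f (Get-Date).ToString('s'), $Message
}
'@ | Set-Content -Path $libPath

. $libPath                      # dot-source the shared library
$entry = Write-ProjectLog -Message 'provisioning started'
$entry
```

With a shared library, a fix to one function benefits every script that dot-sources it, instead of the same bug being fixed separately in each person's private copy.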
- Use a centralized source code control system
- Use SVN, CVS, or Team Foundation Server to maintain a central repository for all scripts
- Allow each scripting team member a personal storage location
- Run scripts from dedicated utility servers or workstations (physical or virtual)
- To reduce network latency, the utility computers should be located physically near the servers. If there are multiple locations, having a utility computer at each site can help create a level of fault tolerance in case of network failures.
- If possible, replicate the exact configuration of the utility computers in your test environment by utilizing automated provisioning. The key element is to ensure that there is an automated method of deploying completed scripts to the production environments.
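An automated deploy step can be as simple as copying the released scripts to each utility computer. This is a minimal sketch using temp folders as stand-ins; in a real deployment the source would be the Production_Scripts folder and the targets would be paths on the utility servers.

```powershell
# Sketch: copy released scripts to each utility computer.
# All paths and the script name are placeholders for illustration.
$source  = Join-Path ([IO.Path]::GetTempPath()) 'Production_Scripts'
$targets = @( (Join-Path ([IO.Path]::GetTempPath()) 'UtilityServer01') )
New-Item -Path $source -ItemType Directory -Force | Out-Null
Set-Content -Path (Join-Path $source 'Get-ServerInfo.ps1') -Value '# released script'
foreach ($t in $targets) {
    New-Item -Path $t -ItemType Directory -Force | Out-Null
    Copy-Item -Path (Join-Path $source '*') -Destination $t -Recurse -Force
}
Get-ChildItem -Path $targets[0]   # confirm the deploy landed
```

Running the same deploy script against test and production utility computers is what keeps the two environments identical over time.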
- Allow the tool builders time to work and ensure the users of the tools know how to use them.
- If there are only a handful of IT staff interested in or able to write scripts, be sure to give them time to do so. This may require re-arranging of duties on the team to start with. If code re-use is adopted fully, the time to production should eventually decrease.
- The focus should be on building tools. You don't have to know how a script was written to be able to use it. This is a key point that is often missed. Once a script has been tested in your production environment and its usage documented, there should be no issue allowing non-scripting team members to use it. This provides the greatest utility from the previously developed script.