Jay Harris is Cpt. LoadTest

a .NET developer's blog on improving the user experience of humans and coders

Azure Websites are a fantastic way to host your own web site. At Arana Software, we use them often, particularly as test environments for our client projects. We can quickly spin up a free site that is constantly up-to-date with the latest code, using continuous deployment from the project's Git repository. Clients can see progress on our development efforts without us having to worry about synchronizing codebases or managing infrastructure. Since Windows Azure is a Microsoft offering, it handles .NET projects naturally, but JavaScript-based nodejs is also a first-class citizen in the Azure ecosystem.

Incorporating Grunt

Grunt is a JavaScript-based task runner that handles the repetitive, menial, and mundane tasks necessary for production readiness: running unit tests, compiling code, processing images, bundling and minification, and more. For our production deployments, we commonly use Grunt to compile LESS into CSS, CoffeeScript into JavaScript, and Jade into HTML, taking the code we write and preparing it for browser consumption. We also use Grunt to optimize these various files for speed through bundling and minification. This work happens at deployment rather than during development; only the source code is committed to Git, never its optimized output.
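For context, here is a minimal sketch of the kind of Gruntfile this describes. The plugin names are the common grunt-contrib ones and the file paths are hypothetical; this is not our actual configuration, and it assumes the plugins have been installed via npm.

module.exports = function (grunt) {
  grunt.initConfig({
    // Compile LESS into CSS (paths are illustrative)
    less: { dist: { files: { 'dist/css/app.css': 'app/css/app.less' } } },
    // Compile CoffeeScript into JavaScript
    coffee: { dist: { files: { 'dist/js/app.js': 'app/js/app.coffee' } } },
    // Minify the compiled JavaScript for browser consumption
    uglify: { dist: { files: { 'dist/js/app.min.js': 'dist/js/app.js' } } }
  });
  grunt.loadNpmTasks('grunt-contrib-less');
  grunt.loadNpmTasks('grunt-contrib-coffee');
  grunt.loadNpmTasks('grunt-contrib-uglify');
  grunt.registerTask('default', ['less', 'coffee', 'uglify']);
};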

Git Deploy and Kudu

Continuous deployment automatically updates your site with the latest source code whenever modifications are made to the source repository. (This also works with Mercurial.) There is plenty of existing documentation on setting up Git Deploy in Azure, so consider that a prerequisite for this article. However, Git Deploy alone only takes the files as they exist in source and deploys them directly to the site. If you need to run additional tasks, such as compiling your .NET source or running Grunt, that is where Kudu comes in.

Kudu is the engine that drives Git deployments in Windows Azure. Untouched, it will simply synchronize files from Git to your /wwwroot, but it can be easily reconfigured to execute a deployment command, such as a Windows Command file, a Shell Script, or a nodejs script. This is enabled through a standardized file named ".deployment". For Grunt deployment, we are going to execute a Shell Script that will perform npm, Bower, and Grunt commands in an effort to make our code production-ready. For other options on .deployment, check out the Kudu project wiki.

Kudu is also available locally for testing, and to help build out your deployment scripts. The engine ships as part of the cross-platform Windows Azure Command Line Tools, distributed through npm.

Installing the Azure CLI

npm install azure-cli --global

We can also use the Azure CLI to generate default Kudu scripts for our nodejs project. Though we will need a few modifications to make the scripts work with Grunt, it gives us a good start.

azure site deploymentscript --node

This command will generate both our .deployment and the default deploy.sh.

Our .deployment file

[config]
command = bash ./deploy.sh

Customizing deploy.sh for Grunt Deployment

From .deployment, Kudu will automatically execute our deploy.sh script. Kudu's default deploy.sh for a nodejs project establishes the environment for node and npm, along with some supporting environment variables. It also includes a "# Deployment" section containing all of the deployment steps. By default, this copies your repository contents to /wwwroot and then executes npm install --production against wwwroot, installing the application's runtime dependencies. Under Grunt, however, we want to run tasks prior to /wwwroot deployment, such as compiling LESS into CSS and CoffeeScript into JavaScript. By replacing the entire Deployment section with the code below, we instruct Kudu to perform the following tasks:

  1. Get the latest changes from Git (or Hg). This is done automatically before running deploy.sh.
  2. Run npm install, installing all dependencies, including those necessary for development.
  3. Optionally run bower install, if bower.json exists. This will update our client-side JavaScript libraries.
  4. Optionally run grunt, if Gruntfile.js exists. Below, I have grunt configured to run the Clean, Common, and Dist tasks, which are LinemanJS's default tasks for constructing a production-ready build. You can update the script to run whichever tasks you need, or modify your Gruntfile to set these as the default tasks.
  5. Finally, sync the contents of the prepared /dist directory to /wwwroot. It is important to note that this is a KuduSync (similar to RSync), and not just a copy. We only need to update the files that changed, which includes removing any obsolete files.

Our deploy.sh file's Deployment Section

# Deployment
# ----------

echo Handling node.js grunt deployment.

# 1. Select node version
selectNodeVersion

# 2. Install npm packages
if [ -e "$DEPLOYMENT_SOURCE/package.json" ]; then
  eval $NPM_CMD install
  exitWithMessageOnError "npm failed"
fi

# 3. Install bower packages
if [ -e "$DEPLOYMENT_SOURCE/bower.json" ]; then
  eval $NPM_CMD install bower
  exitWithMessageOnError "installing bower failed"
  ./node_modules/.bin/bower install
  exitWithMessageOnError "bower failed"
fi

# 4. Run grunt
if [ -e "$DEPLOYMENT_SOURCE/Gruntfile.js" ]; then
  eval $NPM_CMD install grunt-cli
  exitWithMessageOnError "installing grunt failed"
  ./node_modules/.bin/grunt --no-color clean common dist
  exitWithMessageOnError "grunt failed"
fi

# 5. KuduSync to Target
"$KUDU_SYNC_CMD" -v 500 -f "$DEPLOYMENT_SOURCE/dist" -t "$DEPLOYMENT_TARGET" -n "$NEXT_MANIFEST_PATH" -p "$PREVIOUS_MANIFEST_PATH" -i ".git;.hg;.deployment;deploy.sh"
exitWithMessageOnError "Kudu Sync to Target failed"

These commands execute bower and Grunt from local npm installations rather than the global space, as Windows Azure does not allow easy access to global installations. Because bower and Grunt are installed manually based on the existence of bower.json or Gruntfile.js, they also do not need to be referenced in your own package.json. Finally, be sure to leave the --no-color flag enabled for Grunt execution, as the Azure deployment logs will stumble when processing the ANSI color codes that are common in Grunt output.

Assuming that Git Deployment has already been configured, committing these files into Git will complete the process. Because the latest changes from Git are pulled before executing the deployment steps, these two new files (.deployment and deploy.sh) will be available when Kudu is ready for them.
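For example, the final commit and push might look like the following; the remote name azure and the branch master are assumptions based on a typical Git Deploy setup.

git add .deployment deploy.sh
git commit -m "Add Kudu deployment script for Grunt build"
git push azure master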

Troubleshooting

Long Directory Paths and the 260-Character Path Limit

Though Azure does a fantastic job of hosting nodejs projects, at the end of the day Azure is still hosted on the Windows platform, and it brings Windows limitations with it. One of the issues that you will quickly run into under node is the 260-character path limitation. Under nodejs, dependency trees can get rather deep, and because each dependency module loads its own dependency modules under its child folder structure, the folder structure gets deep, too. For example, Lineman requires Testem, which requires Winston, which requires Request; in the end, the directory tree can lead to ~/node_modules/lineman/node_modules/testem/node_modules/winston/node_modules/request/node_modules/form-data/node_modules/combined-stream/node_modules/delayed-stream, which, combined with the root path structure, can far exceed the 260-character limit.

The Workaround

To reduce this nesting, make some of these dependencies into first-level dependencies. With the nodejs dependency model, if a module has already been brought in at a higher level, it is not repeated in the chain. Thus, if Request is made a direct dependency and listed in your project's package.json, it will no longer be nested under Winston, splitting this single dependency branch in two:

  1. ~/node_modules/lineman/node_modules/testem/node_modules/winston
  2. ~/node_modules/request/node_modules/form-data/node_modules/combined-stream/node_modules/delayed-stream

This is not ideal, but it is a workaround for the Windows file structure limitations. The element you must be careful of is dependency versioning: make sure your package.json references the appropriate version of your pseudo-dependency. In this case, make sure your package.json references the same version of Request as is referenced by Winston.
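For illustration, a sketch of the flattened package.json; the version numbers here are hypothetical, and the Request range should match whatever Winston itself declares.

{
  "dependencies": {
    "lineman": "~0.14.0",
    "request": "~2.27.0"
  }
}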

To help find those deep dependencies, use npm list. It will show you the full graph on the command line, supplying a handy visual indicator.

__dirname vs. process.cwd()

In the node ecosystem, process.cwd() is the current working directory of the node process. There is also a common variable named __dirname, created by node; its value is the directory that contains your node script. If you executed node against a script in the current working directory, these values should be the same. Except when they aren't, like in Windows Azure.

In Windows Azure, everything is executed on the system drive, C:. Node and npm live here, and it appears as though your deployment space does as well. However, this deployment space is really a mapped directory, coming in from a network share where your files are persisted. In Azure's node ecosystem, this means that your process.cwd() is the C-rooted path, while __dirname is the \\10.whatever-rooted UNC path to your persisted files. Some Grunt-based tools and plugins (including Lineman) will fail because they reference files relative to __dirname while Grunt's core is attempting to run tasks in the scope of process.cwd(); Grunt recognizes that it is trying to take action on \\10.whatever-rooted files in a C-rooted scope, and fails because the files are not in a child directory.
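A quick way to see the mismatch is to log both values from a script during deployment; the paths below are hypothetical examples, not actual Azure values.

// Hypothetical illustration of the two roots diverging on Azure
console.log(process.cwd()); // e.g. C:\DWASFiles\Sites\mysite\VirtualDirectory0\site\wwwroot
console.log(__dirname);     // e.g. \\10.0.0.1\volume-1\sites\mysite\site\wwwroot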

The Workaround

If you are encountering this issue, reconfigure Grunt to work in the \\10.whatever-rooted scope. You can do this by setting its base path to __dirname, overriding the default process.cwd(). Within your Gruntfile.js, set the base path immediately within your module export:

module.exports = function (grunt) {
  grunt.file.setBase(__dirname);
  // Code omitted
}

Unable to find environment variable LINEMAN_MAIN

If, like me, you are using Lineman to build your applications, you will encounter this issue. Lineman manages Grunt and its configuration, so it prefers that all Grunt tasks be executed via the Lineman CLI rather than directly via the Grunt CLI. Lineman's Gruntfile.js references an environment variable, LINEMAN_MAIN, set by the Lineman CLI so that Grunt runs under the context of the proper Lineman installation. That reference is what causes the failure when Grunt is executed directly.

The Fix (Because this isn't a hack)

Your development cycle has been configured to use lineman, so your deployment cycle should use it, too! Update your deploy.sh Grunt execution to run Lineman instead of Grunt. Also, since Lineman is referenced in your package.json, we don't need to install it; it is already there.

Option 1: deploy.sh

# 4. Run grunt
if [ -e "$DEPLOYMENT_SOURCE/Gruntfile.js" ]; then
  ./node_modules/.bin/lineman --no-color grunt clean common dist
  exitWithMessageOnError "lineman failed"
fi

Recommendation: Since Lineman is wrapping Grunt for all of its tasks, consider simplifying lineman grunt clean common dist into lineman clean build. You will still need the --no-color flag, so that Grunt will not use ANSI color codes.
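A sketch of the simplified step, assuming Lineman passes --no-color through to Grunt just as it does in Option 1:

# 4. Run lineman build
if [ -e "$DEPLOYMENT_SOURCE/Gruntfile.js" ]; then
  ./node_modules/.bin/lineman --no-color clean build
  exitWithMessageOnError "lineman failed"
fi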

The Alternate Workaround

If you don't want to change your deploy.sh—perhaps because you want to maintain the generic file to handle all things Grunt—then as an alternative you can update your Gruntfile.js to specify a default value for the missing LINEMAN_MAIN environment variable. This environment variable is just a string value passed in to node's require function, so that the right Lineman module can be loaded. Since Lineman is already included in your package.json, it will already be available in the local /node_modules folder because of the earlier npm install (deploy.sh, Step #2), and we can pass 'lineman' into require( ) to have Grunt load the local Lineman installation. Lineman will then supply its configuration into Grunt, and the system will proceed as if you executed Lineman directly.

Option 2: Gruntfile.js

module.exports = function(grunt) {
  grunt.file.setBase(__dirname);
  // Default to the local Lineman installation when the CLI did not set it
  process.env['LINEMAN_MAIN'] = process.env['LINEMAN_MAIN'] || 'lineman';
  require(process.env['LINEMAN_MAIN']).config.grunt.run(grunt);
};

Credits

Thank you to @davidebbo, @guayan, @amitapl, and @dburton for helping troubleshoot Kudu and Grunt Deploy, making this all possible.

Changelog

2013-12-03: Updated LINEMAN_MAIN Troubleshooting to improve resolution. Rather than editing deploy.sh to set the environment variable, edit the file to execute Lineman. This is the proper (and more elegant) solution. [Credit: @searls]

Tuesday, 03 December 2013 00:34:25 (Eastern Standard Time, UTC-05:00)

Filed under: NAnt | Task Automation | Tools

Scott Hanselman posted an entry yesterday about Managing Multiple Configuration File Environments with Pre-build Events. His design uses pre-build events in Visual Studio to copy specific configuration files to the default file name, such as having "web.config.debug" and "web.config.release" configuration files and the pre-build event copying the appropriate file to "web.config" based on the current build configuration. This is a great idea, but large web.config files get tedious to maintain, and there is a lot of repeated code. Include files, as Scott suggests, would help, but major blocks, such as Application Settings, may still have to be repeated even though only the values change. This invites human error, since an app setting may be forgotten or misspelled in one of your web.config versions.

At Latitude, we manage this problem through NAnt. One of our former developers, Erik Nelsestuen (brilliant guy), authored the original version of what we call "ConfigMerge". Essentially, our projects have no web.config under source control. Instead, we have a web.format.config. The format config is nearly identical to the web.config, except all of the application settings and connection strings have been replaced with NAnt property strings. Rather than keep a separate web.config for each environment and build configuration, we simply keep NAnt property files. Our build events (as well as our automated build scripts) pass in the locations of the format file and the property file, and the output is a valid web.config, with the NAnt property strings replaced by their values from the environment property file.

It's simple. It only takes one NAnt COPY command.

default.build

<project default="configMerge">
  <property name="destinationfile"
    value="web.config" overwrite="false" />
  <property name="propertyfile"
    value="invalid.file" overwrite="false" />
  <property name="sourcefile"
    value="web.format.config" overwrite="false" />
 
  <include buildfile="${propertyfile}" failonerror="false"
    unless="${string::contains(propertyfile, 'invalid.file')}" />
 
  <target name="configMerge">
    <copy file="${sourcefile}"
        tofile="${destinationfile}" overwrite="true">
      <filterchain>
        <expandproperties />
      </filterchain>
    </copy>
  </target>
</project>

For an example, let's start with a partial web.config, just so you get the idea. I've stripped out most of the goo from a basic web.config, and am left with this:

web.config

<configuration>
  <system.web>
    <compilation defaultLanguage="c#" debug="true" />
    <customErrors mode="RemoteOnly" /> 
  </system.web>
</configuration>

In a debug environment, we may want to enable debugging and turn off custom errors, but in release mode disable debugging and turn on RemoteOnly custom errors. The first thing we will need to do is create a format file, and then convert the values that we want to make dynamic into NAnt property strings.

web.format.config

<configuration>
  <system.web>
    <compilation defaultLanguage="c#" debug="${debugValue}" />
    <customErrors mode="${customErrorsValue}" /> 
  </system.web>
</configuration>

Next, we need to create NAnt property files, adding a value for each NAnt property that we've created. These property files hold the values to be injected into the web.config output. For the sake of simplicity, I always give my property files a '.property' file extension, but NAnt will accept any file name; you can use '.foo' if you like.

debugBuild.property

<project>
   <property name="debugValue" value="true" />
   <property name="configMergeValue" value="Off" />
</project>

releaseBuild.property

<project>
   <property name="debugValue" value="false" />
   <property name="configMergeValue" value="RemoteOnly" />
</project>

Finally, we just execute the NAnt script, passing in the appropriate source, destination and property file locations to produce our environment-specific web.config.

nant configMerge -D:sourcefile=web.format.config -D:propertyfile=debugBuild.property -D:destinationfile=web.config

web.config output

<configuration>
  <system.web>
    <compilation defaultLanguage="c#" debug="true" />
    <customErrors mode="Off" /> 
  </system.web>
</configuration>

And that's all there is to it. You can extend this further using the powers of NAnt. With NAnt includes, you can put all of your base or default values in one property file that is referenced by debugBuild.property and releaseBuild.property, further minimizing repetition, as sketched below. You can also use Scott's pre-build event idea to give each build configuration its own NAnt command that produces mode-specific configuration files.
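For instance, a minimal sketch of a property file pulling shared defaults from a common include; the common.property file name is hypothetical, and that file would itself be a small <project> of <property> elements.

<!-- debugBuild.property: shared defaults live in common.property -->
<project>
   <include buildfile="common.property" />
   <property name="debugValue" value="true" />
   <property name="customErrorsValue" value="Off" />
</project>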

Feel free to use the above NAnt script however you like; but, as always YMMV. Use it at your own risk.

Enjoy.

Friday, 21 September 2007 22:00:26 (Eastern Daylight Time, UTC-04:00)

Filed under: NAnt | Task Automation | Tools

Both NAnt and NAntContrib released version 0.85 on Sunday. The changes to NAnt from 0.85 rc4 include only a few bug fixes. NAntContrib has added the ability to specify the encoding on SQL files. All in all, not much has changed since 0.85 rc4, but that is a good thing, since it indicates the version is finally ready for release. The first release candidate was made available nearly two years ago.

Despite the minimal changes in the final package, consider upgrading just to get rid of the ‘release candidate’ tag.

NAnt v0.85 [ homepage | download | release notes ]
NAntContrib v0.85 [ homepage | download | release notes ]

Tuesday, 17 October 2006 22:42:02 (Eastern Daylight Time, UTC-04:00)

NAnt hates .Net's resource files, or .resx. Don't get me wrong (it handles them just fine), but large quantities of resx will really bog it down.

Visual Studio loves resx. The IDE will automatically create a resource file for you when you open pages and controls in the 'designer' view. Back when we still used Visual SourceSafe as our SCM, Visual Studio happily checked the file in and forgot about it. Now, our 500+ page application has 500+ resource files. Most of these resource files contain zero resources, making them useless, pointless, and a detriment to the build.

This morning I went through the build log, noting every resx that contained zero resources, and deleted all of these useless files.

The compile time dropped by 5 minutes.

Moral of the story: Be wary of Visual Studio. With regard to resx, VS is a malware program that's just filling your hard drive with junk. If you use resx, great, but if you don't, delete them all. NAnt will love you for it.

Wednesday, 15 February 2006 11:31:31 (Eastern Standard Time, UTC-05:00)

I know. I haven’t posted in a while. But I’ve been crazy busy. Twelve hour days are my norm, right now. But enough complaining; let’s get to the good stuff.

By now you know my love for PsExec. I discovered it when trying to find a way to add assemblies to a remote GAC [post]. I’ve found more love for it. Now, I can remotely execute my performance tests!

Execute a LoadRunner test using NAnt via PsExec:

<exec basedir="${P1}"
  program="psexec"
  failonerror="false"
  commandline='\\${P2} /u ${P3} /p ${P4} /i /w "${P5}" cmd /c wlrun -Run
    -InvokeAnalysis -TestPath "${P6}" -ResultLocation "${P7}"
    -ResultCleanName "${P8}"' />

(I’ve created generic parameter names so that you can read it a little better.)
P1: Local directory for PsExec
P2: LoadRunner Controller Server name
P3: LoadRunner Controller Server username. I use an Admin-level ID here, since this ID also needs rights to capture Windows PerfMon metrics on my app servers.
P4: LoadRunner Controller Server user password
P5: Working directory on P2 for 'wlrun.exe', such as C:\Program Files\Mercury\Mercury LoadRunner\bin
P6: Path on P2 to the LoadRunner scenario file
P7: Directory on P2 that contains all results from every test
P8: Result Set name for this test run

'-InvokeAnalysis' will automatically execute LoadRunner Analysis at test completion. If you properly configure your Analysis default template, Analysis will automatically generate the result set you want, save the Analysis session information, and create an HTML report of the results. Now, put IIS on your Controller machine, create a virtual directory pointing to the main results directory in P7, and you will have access to the HTML report within minutes after your test completes.

Other ideas:

  • You can also hook it up to CruiseControl and have your CC.Net report include a link to the LR report.
  • Create a nightly build in CC.Net that will compile your code, deploy it to your performance testing environment, and execute the performance test. When you get to work in the morning, you have a link to your full performance test report waiting in your inbox.

The catch for all of this: you need a session logged in to the LoadRunner controller box at all times. The '/i' in the PsExec command means that it interacts with the desktop.

Sidenote

PsExec is my favorite tool right now. I can do so many cool things. I admit, as a domain administrator, I also get a little mischievous sometimes. The other day I used PsExec to start up Solitaire on a co-worker's box, then razzed him for playing games on the clock.

Friday, 14 October 2005 11:35:40 (Eastern Daylight Time, UTC-04:00)

With our new nightly database restore we now have the desire to automatically run all of the change scripts associated with a project. We’ve found a way; I created a NAnt script that will parse the Visual Studio Database Project (or "DBP") and execute all of the change scripts in it. Here’s how we got there.


Problem 1: Visual Studio Command Files are worthless

Our first idea was to have everyone update a command file in the DBP, and have NAnt run it every night. Visual Studio command files are great and all, but we have discovered a problem with them: they do not keep the files in order. We have named all of our folders (01 DDL, 02 DML, etc.) and our change scripts (0001 Create MyTable.sql, 0002 AddInfoColumn to MyTable.sql) so that they should run in order. We have found that the command file feature of VS.Net 2003 does not keep them in order, but rather seems to sort them first by extension, then by name, or some similar oddness. Obviously, if I try to add InfoColumn to MyTable before MyTable exists, I'm going to have a problem. So, the command file idea was axed.

Problem 2: Visual SourceSafe contents can’t be trusted

Our second idea was to VSSGET the DBP directory in VSS and execute every script in it. However, the VSS store cannot be trusted. If a developer creates a script in VS.Net called ‘0001 Crate MyTable.sql’ and checks it in to the project, then proceeds to correct the spelling error in VS.Net to ‘0001 Create MyTable.sql’, VS does not rename the old file in VSS. Instead, it removes the old file from the project, renames it locally, then adds the new name to the project and to VSS. It also never deletes the old file name from the VSS store. Now, both files (’0001 Crate MyTable.sql’ and ‘0001 Create MyTable.sql’) exist in VSS. Performing a VSSGET and executing all scripts will run both scripts, which could lead to more troubles.


So, we can’t use a command file, because it won’t maintain the order. We can’t trust VSS, since it can have obsolete files. We can only trust the project, but how do we get a list of files, ourselves?

Fortunately, DBP files are just text in a weird XML-wannabe format. The NAnt script will open the file and run through it looking for every ‘SCRIPT’ entry in the file. If it finds a ‘BEGIN something’ entry, it assumes that ’something’ is a folder name, and appends it to the working path until it finds ‘END’, at which time it returns to the parent directory.

It's not perfect. It still runs into some problems, but here it is in v0.1 form.

<project name="RunDBPScripts" default="RunScripts">
<!–-
Execute all scripts in a VS.Net DBP
Author: Jay Harris, http://www.cptloadtest.com, (c) 2005 Jason Harris
License: This work is licensed under a  
   Creative Commons Attribution 3.0 United States License.  
   http://creativecommons.org/licenses/by/3.0/us/ 

This script is offered as-is.
I am not responsible for any misfortunes that may arise from its use.
Use at your own risk.
-–>
<!-– Project: The path of the DBP file –->
<property name="project" value="Scripts.dbp" overwrite="false" />
<!-– Server: The machine name of the Database Server –->
<property name="server" value="localhost" overwrite="false" />
<!-– Database: The database that the scripts will be run against –->
<property name="database" value="Northwind" overwrite="false" />
<target name="RunScripts">
        <property name="currentpath"
            value="${directory::get-parent-directory(project)}" />
        <foreach item="Line" property="ProjectLineItem" in="${project}">
            <if test="${string::contains(ProjectLineItem, 'Begin Folder = ')}">
                <regex pattern="Folder = &quot;(?’ProjectFolder’.*)&quot;$"
                    input="${string::trim(ProjectLineItem)}" />
                <property name="currentpath"
                    value="${path::combine(currentpath, ProjectFolder)}" />
            </if>
            <if test="${string::contains(ProjectLineItem, 'Script = ')}">
                <regex pattern="Script = &quot;(?’ScriptName’.*)&quot;$"
                    input="${string::trim(ProjectLineItem)}" />
                <echo message="Executing Change Script (${server+"\"+database}): ${path::combine(currentpath, ScriptName)}" />
                <exec workingdir="${currentpath}" program="osql"
                    basedir="C:\Program Files\Microsoft SQL Server\80\Tools\Binn"
                    commandline=’-S ${server} -d ${database} -i “${ScriptName}" -n -E -b’ />
            </if>
            <if test="${string::trim(ProjectLineItem) == 'End’}">
                <property name="currentpath"
                    value="${directory::get-parent-directory(currentpath)}" />
            </if>
        </foreach>
    </target>
</project>

I used an <EXEC> NAnt task rather than <SQL>. I found that a lot of the scripts would not execute in the SQL task because of their design. VS Command Files use OSQL, so that’s what I used. I guess those command files were worth something after all.

If you know of a better way, or have any suggestions or comments, please let me know.

Thursday, 25 August 2005 12:15:41 (Eastern Daylight Time, UTC-04:00)

Filed under: Task Automation

With all that we stuff into the database on the QA environment, we need to perform a regular database restore. This way, we also get a fresh DB without any of the corruption from the previous day’s QA attacks.

I created a NAnt script to automate the process, including restoring security access when we restore from a backup created on a different machine. Centering on the NAnt-driven SQL below, my script disconnects all current connections to the database in question (we cannot restore the DB without dropping it, and we cannot drop it while connections are open), drops and restores the database, and refreshes security. It also performs a few other tasks, such as setting all email addresses to internal addresses to prevent spamming the client, and truncating the log, since our server is a little short on disk space.

if exists (Select * from master.dbo.sysdatabases where name = '${database}')
Begin
    DROP DATABASE [${database}]
End

RESTORE DATABASE [${database}]
    FROM DISK = N'${backupfile}'
    WITH FILE = 1,
    NOUNLOAD,
    STATS = 10,
    RECOVERY,
    -- changes file locations from what was in the backup
    MOVE '${dataname}' TO '${path::combine(datadirectory,database+'.mdf')}',
    MOVE '${logname}' TO '${path::combine(logdirectory,database+'_Log.ldf')}'
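The connection-drop step is not shown above. As a minimal sketch of one way to do it with SQL Server 2000-era syntax (not necessarily what my script does), you can force the database into single-user mode before the drop:

-- Roll back open connections so the subsequent DROP DATABASE can succeed
ALTER DATABASE [${database}] SET SINGLE_USER WITH ROLLBACK IMMEDIATE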
Tuesday, 23 August 2005 12:09:03 (Eastern Daylight Time, UTC-04:00)

Filed under: NAnt | Task Automation | Tools

Hopefully this will save a few of you some time: I have created a registry entry that will create file associations and commands for your NAnt .build files. It will associate .build files as “NAnt Build Files” and create two commands for right-clicking a .build file in Explorer: “Edit” will open the file in Notepad; “Run” will execute the file in NAnt using a persistent command window (the window won’t disappear when the script is finished).

NAnt Build File Associations

Windows Registry Editor Version 5.00

[HKEY_CLASSES_ROOT\.build]
@="build_auto_file"

[HKEY_CLASSES_ROOT\build_auto_file]
@="NAnt Build File"
"EditFlags"=dword:00000000
"BrowserFlags"=dword:00000008

[HKEY_CLASSES_ROOT\build_auto_file\shell]
@="Edit"

[HKEY_CLASSES_ROOT\build_auto_file\shell\&Run]
@="Run"

[HKEY_CLASSES_ROOT\build_auto_file\shell\&Run\command]
@="C:\\WINDOWS\\system32\\CMD.EXE /k \"C:\\Program Files\\NAnt\\bin\\NAnt.exe\" -buildfile:\"%1\""

[HKEY_CLASSES_ROOT\build_auto_file\shell\&Run\ddeexec]

[HKEY_CLASSES_ROOT\build_auto_file\shell\&Run\ddeexec\Application]
@="NAnt"

[HKEY_CLASSES_ROOT\build_auto_file\shell\&Run\ddeexec\Topic]
@="System"

[HKEY_CLASSES_ROOT\build_auto_file\shell\edit]
@="&Edit"

[HKEY_CLASSES_ROOT\build_auto_file\shell\edit\command]
@="C:\\WINDOWS\\system32\\NOTEPAD.EXE \"%1\""

[HKEY_CLASSES_ROOT\build_auto_file\shell\edit\ddeexec]

[HKEY_CLASSES_ROOT\build_auto_file\shell\edit\ddeexec\Application]
@="NOTEPAD"

[HKEY_CLASSES_ROOT\build_auto_file\shell\edit\ddeexec\Topic]
@="System"
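To apply the associations without a confirmation prompt, import the file silently with regedit; the .reg file name here is hypothetical.

regedit /s nant-build-files.reg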

Use this code/file at your own risk. I offer it as is, without any support. By downloading this file or using this code you take full responsibility for any repercussions that it may have on your computer.

Thursday, 11 August 2005 13:46:23 (Eastern Daylight Time, UTC-04:00)

Filed under: ASP.Net | Programming | Task Automation

The default settings of NUnit, TestRunner, and Test Driven Development all want different copies of the app.config in different locations. If ProjectName creates ProjectName.dll, then NUnit wants ProjectName.config, TR wants ProjectName.dll.config, and TDD wants TargetDir\ProjectName.dll.config. This is a lot of work to put in the post-build event of every unit test project, and it can be even more work when another testing tool comes along that wants yet another config filename. The best way to manage all of these file copies is through a common post-build event call.

Many probably opt for a NAnt script, but we found that passing in the required paths can sometimes cause NAnt to get confused, and it won’t properly parse the parameter listing. So, we went with a command file, instead.

CopyConfigs.cmd

rem for nunit
copy "%~1App.config" "%~1%~2.config"

rem for testrunner
copy "%~1App.config" "%~1%~2.dll.config"

rem for testdrivendevelopment
copy "%~1App.config" "%~3.config"

VS.Net Post Build Event

call "C:\MyPath\CopyConfigs.cmd" "$(ProjectDir)" "$(ProjectName)" "$(TargetPath)"

VS.Net already includes a series of NAnt-like properties for project names, project directories, target [assembly] filenames, and so on; these come in handy for creating a universal script. Placing the path references in quotes allows for spaces and other characters (except more quotes) in the path. Executing the command file through a call allows a little more versatility with the argument references: %~1 removes the surrounding quotes from the argument value, allowing us to append a few together without jacking up the subsequent path.
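To make the expansion concrete, here is what the call effectively executes, given hypothetical values of ProjectDir=C:\Src\MyTests\, ProjectName=MyTests, and TargetPath=C:\Src\MyTests\bin\Debug\MyTests.dll:

copy "C:\Src\MyTests\App.config" "C:\Src\MyTests\MyTests.config"
copy "C:\Src\MyTests\App.config" "C:\Src\MyTests\MyTests.dll.config"
copy "C:\Src\MyTests\App.config" "C:\Src\MyTests\bin\Debug\MyTests.dll.config"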

Monday, 08 August 2005 13:48:20 (Eastern Daylight Time, UTC-04:00)