Tuesday, 23 August 2011

Static Analysis with C/C++ and Rational Team Concert

To recap: as with my previous posts, this one builds upon the previous one. As this chain of posts is getting quite long now, I think it's a good time to list them again here:
  1. OSLC Development using C/C++ and Rational Team Concert (Part 1)
  2. Open Services for Lifecycle Collaboration (OSLC) (a short intro)
  3. OSLC Development using C/C++ and Rational Team Concert (Part 2)
  4. Automated builds using C/C++ and Rational Team Concert
In the last post, (4), I went over how to create an automated build with the Jazz build engine and Rational Team Concert. In this post we will enhance that build to include static analysis of the code. To be honest, the sample code is not very interesting from this perspective, but we will still get some reports, which is what matters for this demonstration.

There are other static analysis tools out there that plug into Eclipse, including CODAN (included in CDT 8.0) and cppcheck, as well as commercial ones. You could use any of them in this scenario if you know how to launch them from the command line and generate some interesting output. For the purposes of this exercise I will use Rational Software Analyzer (RSAR) 7.1 Enterprise Edition, which is a plugin for Eclipse 3.4. I am using it because it produces some nice output that I can attach to the build in two ways: 1) through a link to the RSAR server, and 2) as a package of results in XML format, without needing to do much. It also comes with a 30-day trial so you can start with it right away.

The RTC 3.0.1 client only works with Eclipse 3.5/Galileo and 3.6/Helios (I even got it to work on 3.7/Indigo) but not 3.4/Ganymede, so we will need to install the RTC 2.0.0.2 client into RSAR 7.1 to be able to open RTC projects (even though it will not be able to connect to the RTC 3.0.1 server).

OK, first we need to set up our environment.

1. Download and install RSAR 7.1 Enterprise Edition.
- Default options are perfect

2. Download and install the RTC 2.0.0.2 client into the SAME package group as RSAR 7.1.

You may need to tweak the server.xml of the RSAR 7.1 Enterprise server to change the SHUTDOWN port if another Tomcat instance on the machine (such as the Jazz Team Server's) is already using it. For example, in this file:

C:\Program Files\IBM\RSAREE\tomcat\conf\server.xml
Change:

<Server port="9005" shutdown="SHUTDOWN">

to:

<Server port="9006" shutdown="SHUTDOWN">

3. Start the RSAR Enterprise server.

Now we need to prepare our project:

1. Let's create a rule file for the static analysis. Launch RSAR and select the same workspace you loaded the OSLC Consumer project into. Mine is C:\workspace, for example. Also:
- You may need to shut down the RTC 3.0.1 client if it is running.
- If you have the RTC 2.0.0.2 client installed into RSAR properly, RSAR will show an error telling you that the RTC 2.0.0.2 client cannot connect to the RTC 3.0.1 server. That's OK, we don't care. Ignore it and press "OK".
- If RSAR pops up a large dialog complaining about workspace consistency, also ignore it and select "Repair Later".

2. At this point you should see the OSLC Consumer project; if not, switch to the C/C++ perspective.

3. Right-click on the project->Software Analyzer->Software Analyzer Configurations...

4. Create a new configuration called "OSLC Consumer". Make sure the OSLC Consumer project is in scope.

5. Select all the C/C++ rules as below:

6. Now click the "Export" button and save the project into the OSLC Consumer Project directory as "OSLC Consumer.dat"

7. Click Close. (You could analyze here if you want but we will do it as part of our build.)

8. Close RSAR 7.1.

We are done in terms of configuring RSAR to analyze our code. If we wanted to change the rules or create different rule sets, we would follow the same procedure as above. Next we need to check the .dat file into RTC and modify our build to run a static analysis after compilation. After compilation makes more sense because if the code doesn't compile, it isn't worth running a static analysis on it anyway.

1. Launch RTC 3.0.1 Client

2. You may again see the large dialog with the "Crash detected" message. This time you should click "Repair Now".

3. We will create a new makefile for the JBE build that includes the static analysis.
- We could also include this as part of our local build, as the local build is a subset of the build file we will use here.

4. Download the new JBEMakefile, copy it into your OSLC Consumer project directory, add it to the project, and deliver it into the SCM.

Let's review the changes we have made here. You may need to tweak the JBEMakefile, including:

CLASSPATH = C:/jazz/buildsystem/buildengine/eclipse/plugins/org.apache.ant_1.7.1.v20090120-1145/lib/ant-launcher.jar
JAZZ_TOOLKIT = C:/jazz/buildsystem/buildtoolkit
JAVA = C:\Progra~2\IBM\Java60\bin\java

You should adjust these according to your environment. Also note that we could have made these build properties in the same way as "repositoryAddress", but we will leave it like this for now and you can do it later.

5. Download the build.xml file and do the same as above (save to project and deliver).

You will also need to adjust this particular file to your environment. In particular:

<property name="userId" value="ADMIN" />
<property name="password" value="password" />

These should be the same as your build user's name and password.

Search and replace:

C:\Progra~2\IBM\RSAREE

to point to your RSAR installation. Again, it would have been better to make this a build property. I leave that as an exercise for you.

6. Go to the CDT build definition created in the previous post and change the command line as follows. Note the build properties; this is where you would set the other ones. Also note that the make command now uses the JBEMakefile.

7. One last thing: you need to load the build workspace and accept the changes you have delivered. First, unload your current workspace to prevent confusion.

You can load the build workspace by issuing a load command on the workspace.
Now accept the changes in the Pending Changes view. That's it!

You can peruse the files at will. The end result is that the Makefile launches the static analysis and uses Ant to interact with the Jazz build engine, updating the build result and reporting progress. This is done using the Ant tasks that come with the Jazz Build Toolkit. For information on those tasks, see the CLM 3.0.1 information center.

The build result should look something like this, with a link to the RSAR server hosting the static analysis web pages and an attachment with the analysis results in XML format.
That's it for today. Next time I will expand the build and add some unit tests!

Friday, 12 August 2011

Automated builds using C/C++ and Rational Team Concert

In the previous two posts (Part 1, Part 2), I went through the process of setting up the basics to get started with C/C++ in Rational Team Concert. We were building and running the code from RTC menus, and while this is fine for a one-person hobby project, it isn't really enough if we want to produce quality code in a team production environment. For that we need some automated tests and builds, and in this post I will go through the steps to set up an automated C/C++ build in Rational Team Concert. We will build the OSLC code sample that we imported in Part 2 for this exercise.

What you will need:

1. Rational Team Concert Client 3.0.1
2. Jazz Team Server with Change and Configuration Management (CLM 3.0.1)
3. Build System Toolkit 3.0.1
4. Ensure that mingw32-make (see Part 1) is installed and in your PATH:
mingw-get install mingw32-make

I am going to assume that Jazz Team Server 3.0.1 is installed with the Change and Configuration Management application and that you have created a new project area called "CDT" using the out-of-the-box Scrum process. If you do not know how to connect to a repository or create a project area, I will refer you to the CLM 3.0.1 information center. Information on how to get set up, plus more, can be found there. If all went well, your project area in the Team Artifacts view of the Work Item perspective should look like this:
IMPORTANT: Make sure you are assigned to the project as "Scrum Master"! Being project administrator is not enough.

We have a default stream and a default component, and to keep things simple we will be using those for this exercise. Go to the C/C++ perspective in RTC and deliver the OSLC Consumer project into the default stream and component.

Here's how to do it:

1. Right-click on the OSLC Consumer project->Team->Share Project->Jazz Source Control

2. You will see that a workspace has already been created for you that follows the default stream. Select the CDT Stream Workspace's Default Component and then click "Finish".

3. Now that we've added the project to the workspace, we can simply deliver it to the stream by going into the Pending Changes view, right-clicking on the CDT Stream Workspace, and selecting "Deliver".
NOTE: I will not associate a work item with the delivery at this point, but we will do so later with another delivery.

4. Now our code is in the default stream, which we will set up for an automated build.

5. The first thing we should do is set up a separate workspace for the build engine. We could use our default workspace for this, but while we are working on the code it may not be buildable, and we also don't want the build engine to load over our work.
Go to the Team Artifacts view in the Work Item perspective. Right-click on "My Repository Workspaces". Select the default stream:
Click "Next" and call the new build workspace: "Build CDT Workspace". Click Next twice to get to the page asking for Read Access Permission. Select "Public".
NOTE: Usually you would want to make this private to the build engine but again for simplicity we will make it public.

Click "Next" again and ensure that the Default Component is selected and "Load repository workspace after creation" is unchecked.
Click "Finish".

6. Next we need to create a build engine and a build definition. To create a build engine, simply right-click on the Build Engines section of our project area:
Click "Next" and ensure we are creating a new build engine called "CDT engine" of the "Jazz Build Engine" type. Click "Finish" and then click the "Save" button in the top right corner of the new build engine.

7. Now that we have an engine, we need to define a build. Right-click on "Builds" this time.
Select "New Build Definition...". Click "Next" to create a new build using the "Command Line - Jazz Build Engine" template. Leave the ID as "CDT build". Click "Next" and select "Jazz Source Control" on the Pre-Build page. Everything else will be left at the defaults for now, so click "Finish".

8. In the new build definition, type "mingw32-make clean all" in the "Command" field of the "Command Line" tab, and set the working directory to the location of the Makefile (we will add it to the project):

9. In the "Jazz Source Control" tab set the Build workspace to the one we just created for this build: "Build CDT Workspace" and select a Load directory in "Load Options" (In my case it is C:\build\CDT). Also Select "Delete directory before loading".
NOTE: A dialog will popup regarding the build owner. Press Ok.

 
Add the "CDT engine" in the "Supporting Build Engines" section of the "Overview" tab:

Click "Save" in the top right corner.

10. Now that we have defined a build engine and a build definition, we need to set up the Jazz Build Engine itself to perform the build. Extract the Build System Toolkit 3.0.1 to a memorable location (mine is directly on the C: drive: C:\jazz).

11. Add the directory containing the jbe command line to the PATH environment variable (C:\jazz\buildsystem\buildengine\eclipse in my case).

12. Open up a command prompt and start the build engine:
jbe -repository https://<server name>:<server port>/ccm -userId <username> -pass <password> -engineId "CDT engine" -sleeptime 1

Replace the server name, port, username and password with those that you use to connect to the CCM application on the Jazz server. If everything went well you should only see:

14. We are using a makefile-based build, so we need a Makefile in our project. We could get RTC to generate the build file for us, but as the build engine needs to use it too, we should keep it simple. For your convenience I have created one here.
Save the Makefile into the root of the "OSLC Consumer" project directory in your RTC workspace. In RTC, right-click the "OSLC Consumer" project in the project explorer and select "Refresh".

15. In order to test the makefile we need to set the project builder to use mingw32-make. To do this, go to the project properties by right-clicking on "OSLC Consumer". Go to C/C++ Build->Tool Chain Editor and select "Gnu Make Builder" for both the Release and Debug configurations:
Then select the "C/C++ Build" section and select "All Configurations" in the Configuration pull-down. Use an external builder and make sure the build command is mingw32-make and that the "Generate Makefiles automatically" option is off:

16. Now we are using the Makefile instead of the internal builder, so let's try it by right-clicking on the project and selecting "Build Project". It should build without errors.

17. Now let's check the Makefile into the SCM. Right-click on the Makefile and select Team->Check In and Deliver. A dialog will pop up where you can enter a comment. Click "Next".

18. We are now going to create a work item, so click the "Create Work Item" option. On the next screen you can fill in a task as in the screenshot:

Click "Finish". If it asks you if you want to check in changes then click "Yes".

19. That should be it! Let's request an automated build. Go back to the Work Item perspective in RTC and request a build by right-clicking on the build definition and selecting "Request Build...".
If all goes well you should see a successful build in the builds view (you may need to refresh it a few times):
 
You can double click on the build result and inspect the build result record. You should see:
- The code changes in the build
- The build logs
- The workspace that was built
- 1 work item in the build (the one we created earlier)
- A snapshot

This is quite a long post, so I will cut it short here. In any case, you can go into the build definition and schedule the build to run whenever you want.

Next time I will continue enhancing the feedback from the Makefile build.

Sunday, 7 August 2011

OSLC Development using C/C++ and Rational Team Concert (Part 2)

This is the second part in what I hope will be a series of posts on C/C++ development in Rational Team Concert (RTC).

In Part 1 I showed how to set up the CDT environment in RTC 3.0.1. If you haven't gone through the steps in Part 1, you should do so before doing this part. You should be able to build simple code now.
In this part you will import an OSLC client written in C++ (see my post on OSLC). In order to do so, we need to introduce some libraries into our environment to be able to make HTTP requests (libcurl) and parse the returned XML (libxml2).

You'll need the following:

(Latest as of publishing)
1. libcurl: curl-7.21.7-devel-mingw32.zip
2. libxml2: libxml2-2.7.8.win32.zip
3. libiconv: iconv-1.9.2.win32.zip
4. RTC OSLC C++ Project: OSLC Consumer.zip

Configure and Install:

1. First let's download some dependencies to prep our MinGW environment. Open up a command prompt and type the following commands:
mingw-get install zlib
mingw-get install libiconv
2. Unzip curl-7.21.7-devel-mingw32.zip. Ensure that the contents of bin, include, lib, etc. are extracted into the corresponding directories in the MinGW home (usually C:\MinGW).

3. Now unzip libxml2-2.7.8.win32.zip. Again, ensure that the contents of bin, include, lib, etc. are extracted into the corresponding directories in the MinGW home (usually C:\MinGW).
As libxml2 is a generic Win32 package, we need to prep it for MinGW. Open a command prompt and type:
cd C:\mingw\bin (or wherever your MinGW home is)
pexports libxml2.dll > libxml2.def
dlltool --dllname libxml2.dll --def libxml2.def --output-lib ../lib/libxml2.a

4. One more dependency needed by libxml2 is missing. Open iconv-1.9.2.win32.zip and extract bin\iconv.dll into C:\mingw\bin (or wherever your MinGW home is).

5. Now we are going to import the sample project into RTC.
-Launch RTC Client
-File->Import
-General->Existing Projects Into Workspace
-Click Next
-Select Archive File: Browse for the OSLC Consumer.zip file downloaded earlier
-Click Finish


You can build and run the project at this point by right-clicking on the newly imported project and selecting "Build Project". It should build without errors or warnings. Unless you have exactly the same host, port, username and password, you will get some kind of connection error if you try to run it using the run button.

If you take a look at the source code, the first thing you will notice is that this is neither proper C nor C++ code. As this is just a code sample to convey the general idea, it was written in a very simple, linear fashion with no classes and few parameters.
In order for the code to connect to your Jazz Server you will need to change the hostname, port and username/password in the OSLCPAFetch.h file.
const std::string server = "https://localhost:9443/ccm"; <- your Jazz server
const std::string login = "username"; <- your user name
const std::string password = "password"; <- your password


This code demonstrates a few things (rough sketches of points 2-4 follow below):

1. A build environment set up with extra libraries (take a look at the project properties for the linker settings: C/C++ Build->Settings->MinGW C++ Linker->General).
2. How to create OSLC requests (note the header parsing).
3. How to do form-based authentication with Jazz.
4. How to fetch project areas using libxml2's XPath functionality.
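
To make points 2 and 3 a little more concrete, here is a minimal sketch of what Jazz form-based authentication and an OSLC-style request can look like with libcurl. This is not the actual sample code: the writeToString helper, the /authenticated/identity and /process/project-areas paths, and the overall structure are my own illustrative assumptions, so treat it as a rough outline and refer to the project source for the real thing.

// Rough sketch only (assumed paths and helper names), not the shipped sample code.
#include <curl/curl.h>
#include <iostream>
#include <string>

// libcurl write callback that appends the response body to a std::string.
static size_t writeToString(void* contents, size_t size, size_t nmemb, void* userdata) {
    std::string* out = static_cast<std::string*>(userdata);
    out->append(static_cast<char*>(contents), size * nmemb);
    return size * nmemb;
}

int main() {
    const std::string server   = "https://localhost:9443/ccm"; // <- your Jazz server
    const std::string login    = "username";
    const std::string password = "password";

    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL* curl = curl_easy_init();
    if (!curl) return 1;

    std::string body;
    curl_easy_setopt(curl, CURLOPT_COOKIEFILE, "");       // in-memory cookies (JSESSIONID)
    curl_easy_setopt(curl, CURLOPT_SSL_VERIFYPEER, 0L);   // self-signed test certificate
    curl_easy_setopt(curl, CURLOPT_SSL_VERIFYHOST, 0L);
    curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, writeToString);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, &body);

    // 1. Touch a protected resource so the server starts a session. The real sample
    //    parses the response headers (e.g. X-com-ibm-team-repository-web-auth-msg)
    //    to see whether authentication is actually required.
    curl_easy_setopt(curl, CURLOPT_URL, (server + "/authenticated/identity").c_str());
    curl_easy_perform(curl);

    // 2. Form-based login: POST j_username/j_password to j_security_check.
    const std::string form = "j_username=" + login + "&j_password=" + password;
    curl_easy_setopt(curl, CURLOPT_URL, (server + "/j_security_check").c_str());
    curl_easy_setopt(curl, CURLOPT_POSTFIELDS, form.c_str());
    curl_easy_perform(curl);

    // 3. Now request the project areas, telling the server we want XML/OSLC.
    struct curl_slist* headers = NULL;
    headers = curl_slist_append(headers, "Accept: application/xml");
    headers = curl_slist_append(headers, "OSLC-Core-Version: 2.0");
    body.clear();
    curl_easy_setopt(curl, CURLOPT_HTTPGET, 1L);          // switch back to GET after the POST
    curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headers);
    curl_easy_setopt(curl, CURLOPT_URL, (server + "/process/project-areas").c_str());
    CURLcode rc = curl_easy_perform(curl);
    if (rc != CURLE_OK)
        std::cerr << "Request failed: " << curl_easy_strerror(rc) << std::endl;
    else
        std::cout << body << std::endl;                   // raw XML listing the project areas

    curl_slist_free_all(headers);
    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return 0;
}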

If you need more in-depth information about how this code works, I suggest taking a look at the OSLC workshop, as this code sample is loosely based on it. Enjoy!
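
For point 4, the XPath side of things, the lookup with libxml2 looks roughly like the sketch below. The jp06 namespace URI, the element and attribute names, and the printProjectAreas function are again my own assumptions for illustration; dump the raw XML from your server first and adjust the expression to match what it actually returns.

// Rough sketch only: pulling project area names out of the returned XML with libxml2.
#include <libxml/parser.h>
#include <libxml/xpath.h>
#include <libxml/xpathInternals.h>
#include <iostream>
#include <string>

void printProjectAreas(const std::string& xml) {
    xmlDocPtr doc = xmlReadMemory(xml.c_str(), static_cast<int>(xml.size()),
                                  "projectareas.xml", NULL, 0);
    if (!doc) { std::cerr << "Could not parse the response" << std::endl; return; }

    xmlXPathContextPtr ctx = xmlXPathNewContext(doc);
    // Register the namespace so the jp06 prefix in the XPath expression resolves.
    // (Namespace URI assumed here; check the xmlns declarations in your output.)
    xmlXPathRegisterNs(ctx, BAD_CAST "jp06",
                       BAD_CAST "http://jazz.net/xmlns/prod/jazz/process/0.6/");

    xmlXPathObjectPtr result =
        xmlXPathEvalExpression(BAD_CAST "//jp06:project-area/@jp06:name", ctx);

    if (result && result->nodesetval) {
        for (int i = 0; i < result->nodesetval->nodeNr; ++i) {
            xmlChar* name = xmlNodeGetContent(result->nodesetval->nodeTab[i]);
            std::cout << "Project area: " << reinterpret_cast<const char*>(name) << std::endl;
            xmlFree(name);
        }
    }

    xmlXPathFreeObject(result);
    xmlXPathFreeContext(ctx);
    xmlFreeDoc(doc);
}

In practice you would feed this function the body string collected by the libcurl callback in the previous sketch.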


Next up (part 3)... creating an automated build environment for our new code using RTC SCM and automated builds!

Saturday, 6 August 2011

Open Services for Lifecycle Collaboration (OSLC) (a short intro)

Before I post the OSLC code sample, I thought I would provide a little bit of background on it.

The technology graveyard is full of past, and often short-lived, attempts at providing a universal inter-technology communication platform. These brave attempts at inter-system "peace" tried to provide standard inter-system communication regardless of implementation technology, operating system or even network protocol; a kind of IT Esperanto.

One of the more popular attempts that actually had some degree of traction was CORBA. It allowed the creation of interfaces that were implemented by one system and called from another. Sounds good, except that this communication was facilitated by middlemen called ORBs, or "brokers". These powerful middlemen were platform-specific translators that needed to be implemented on each technology that wanted to either call or provide interfaces. They were responsible for brokering requests in a standard way between each other and were often developed by different vendors in different technologies.

CORBA ultimately suffered the flaw of so many of its predecessors (and some successors): the middlemen had constant interoperability problems. Even successors to CORBA that provided a toolkit or some other third-party middle layer to facilitate communication (traditional Web Services come to mind) had the same issues.

The logical conclusion to all of this would be to eliminate the middleman. Like Esperanto, perhaps it's better to decide on a common language that the two parties should speak and eliminate the need for a translator who ultimately makes mistakes. Fortunately, because of the success of the Internet, such a language is readily available, HTML (or rather a more generic form called XML), as well as a common medium (HTTP). This common paradigm of information exchange has been collectively wrapped up into a set of principles called REST. The great thing about REST is that it uses an already proven mechanism and eliminates the need for a third-party translator.

Now that systems have a common approach (REST) for "talking" to each other, they need to know and understand what to "say"; they have to find a common language. In its most basic form this is XML, but usually for systems to "talk" to each other they need to establish a common vocabulary, a lexical structure with common terms that are understood by both parties. Both systems may introduce new terms or words, but agreement on the most basic language is usually required (and developers who haven't read the agreement can still read the XML and figure it out; that's the great thing about it!). One such effort in the domain of software engineering is Open Services for Lifecycle Collaboration (OSLC).



OSLC is a workgroup of industry leaders who are defining the vocabulary that should be used between components in an application lifecycle: requirements management, quality management and change management, for example. It is also the integration technology adopted by the Jazz Platform for integration between its components: Requirements Composer, Quality Manager and Team Concert (as well as others!). It allows loose coupling between components, making it possible to plug and play one component for another provided by a different vendor. By creating an OSLC consumer or provider, it is wholly possible to integrate that organic requirements solution you developed in-house in the '90s with Quality Manager and Team Concert.

We can take that '90s-era organic legacy solution developed in C++ as a case in point. Perhaps the boss extols the virtues of the in-house legacy requirements management solution with a passionate proclamation: "Legacy means it works!". Now he wants your team to integrate it with Team Concert.

With that in mind, in the next blog I will demonstrate a C++ OSLC client in Rational Team concert.

Here are some excellent resources on OSLC:

OSLC Home Page: http://open-services.net/
Steve Speicher's OSLC Blog: http://stevespeicher.blogspot.com/
Open Services Wiki: http://open-services.net/bin/view/Main/WebHome
OSLC Workshop (Java OSLC client): https://jazz.net/library/article/635
Eclipse Lyo Project: http://www.eclipse.org/proposals/technology.lyo/