// Render the view offscreen at full HD and write it out as a PNG.
view.setSize(1920, 1080);
BufferedImage image = new BufferedImage(view.getWidth(), view.getHeight(),
    BufferedImage.TYPE_INT_RGB);
view.paint(image.getGraphics());
ImageIO.write(image, "png", new File(args[1]));
System.exit(0);
I ran that tool to create an image for every single day, and then I wrote another short script to stamp the date onto each image (ImageMagick works really well here):
DATE=$(echo $1 | cut -d'.' -f1)
convert -fill "#aaa" -pointsize 50 label:"$DATE" /tmp/label.png
composite -compose Multiply -gravity southwest /tmp/label.png $1 anno-$1
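To process the whole batch, a loop like the one below does the trick, assuming the snippet above is saved as annotate.sh (a name chosen here just for illustration) and run from the directory containing the per-day PNGs:

for f in *.png; do
  sh annotate.sh "$f"   # writes anno-<date>.png, which the mencoder step below picks up
done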
Now, with 312 images on hand, I decided to make them into a video:
mencoder mf://out/anno-*.png -mf w=1920:h=1080 -ovc lavc -lavcopts vcodec=ffv1 -of avi -ofps 3 -o output.avi
I then converted the high-def, lossless AVI into an Ogg file, and produced the following animated video of historical code coverage:
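One way to do that conversion is with ffmpeg2theora, which goes straight from the lossless AVI to Ogg Theora in a single command; the quality setting below is just an example, not necessarily what I used:

ffmpeg2theora -v 8 -o output.ogv output.avi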
Okay, so there's no sound for the animation yet; the encoding is painful enough that I don't want to attempt that right now. I also didn't filter out any of the days where the tests failed early, so you will occasionally see flashes of red. The data also doesn't include anything recent (I am holding off until I can figure out how to run mozmill tests and get JS code coverage). Anyways, enjoy!
8 comments:
Awesome! Super kudos! And we turn pretty green too!
Any chance of publishing the data and/or converting it to a JS visualization/animation using protovis (which could then link to the coverage info for each file)? With the video scaled down it's hard to make out some of the file names, and that would be one way to address it while adding extra features.
The untarred version of the source data alone comes out to 1.4 GB, which is a bit more space than I have available on my current webhosting options.
Eventually, I want to have a nice web app that will let you jump to particular dates and also show whose changes impacted each day.
I can provide the raw source data to anyone upon request.
Excellent work Joshua - thanks for doing it. It's nice to see the green popping in!
As Andrew says, it would be nice to see big check-ins adding plenty of tests.
Sweet video -- and I do like the flashes of red. :-) The difference from start to finish is definitely notable.
It'd be interesting to see something like this for the entire Mozilla source code, although I'm sure your per-file method would break down quite a bit with that many more files, unfortunately.
This stuff is goooooooooood!!!
Let's touch base when your time permits so that we can port your work into Firefox coverage data visualizations.
Murali Nandigama
Source code for the visualization tool is available in an hg repository now. If you already have a batch of the lcov .info files, this will be all you need.
The build scripts are presently rather specific to my setup, given a libpango incompatibility problem, the c-c/m-c sync annoyances, libthebes breaking gcov on 64-bit machines, and the fact that I don't have root privileges on the machines in question.
Yes, I have the lcov .info files, and I also have a bunch of scripts that take the lcov-generated HTML output and create a CSV file of the code coverage data for each file.
Let me check your HG repo. Thanks
Wow, that rocks! It's really encouraging to see how much better we've gotten over time!