Git Cherry Picking Across Forked Repos and Empty Commits

Recently I found myself in a situation where I wanted to bring a specific upstream commit into a forked repository. Although these repos share a common history, the two had diverged enough that it wasn't a straightforward cherry-pick between branches. Instead, with clones of the two repositories, I managed to cherry-pick as follows:

git --git-dir=../<other-repo>/.git format-patch -k -1 --stdout <commit-sha> | git am -3 -k

To complicate things further, a few days later, I found myself wanting to do the same thing, however, this time a submodule and another file had diverged enough that the patch no longer applied correctly. To get around this I had to:

git --git-dir=../<other-repo>/.git format-patch -k -1 --stdout <commit-sha> | patch -p1 --merge

Manually fix any remaining broken hunks, then create a new commit with the changes.
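
After the patch command the changes sit uncommitted in the working tree, so the clean-up looks roughly like this (the commit message is just a placeholder):

git status                        # check for leftover conflict markers or .rej files
git add -A                        # stage the manually fixed files
git commit -m "Port upstream fix" # placeholder message describing the ported change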

These two Stack Overflow answers helped work both of these issues out: https://stackoverflow.com/a/9507417 and https://stackoverflow.com/a/49537226

Finally, I’ve also in recent months found myself wanting to create a completely empty commit to kick off a downstream build process… much like you may touch a file to change its timestamp. To do this you can simply run:

git commit --allow-empty -m "Redeploy"

Merging a git repository from upstream when rebase won’t work

I use a lot of open source software in my research and work.

In recent months I’ve been modifying the source code of some open source repositories to better suit my needs, and I’ve contributed a few small changes back to the DeepLearning4J and Snacktory projects.

This morning I’m starting to work on a further patch for the DeepLearning4J repository and I needed to bring my local repository up to date before committing the change. However, at some point over the past few months the DeepLearning4J repository has been rebased and my fork of it will no longer merge.
The usual approach for fixing this is to use the command:

git rebase upstream/master

However, for me this produces an error:

git encountered an error while preparing the patches to replay
these revisions:

As a result, git cannot rebase them.

I tried on two different computers and got similar errors on both.

As I didn’t want to delete my entire repository and create a whole new fork of the upstream master, this is the approach I took to fix the problem:

Back up the current master into a new branch:

git checkout -b oldMasterBackupBranch
git push origin oldMasterBackupBranch

Switch back to the master branch and replace it with the upstream master:

git checkout master
git remote add upstream url/to/upstream/repo
git fetch upstream
git reset --hard upstream/master

Push the updated master to my GitHub fork:

git push origin master --force
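
If any of my own local commits turn out to be worth keeping, they can then be cherry-picked back from the backup branch, roughly like this (the SHA is a placeholder):

git log oldMasterBackupBranch --oneline   # find the local commits worth keeping
git cherry-pick <commit-sha>              # re-apply them on top of the new master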

This Stack Overflow question helped a lot in working out this problem: Clean up a fork and restart it from the upstream

Git logs and commits across multiple branches

Like any good computer scientist I use git for many research and personal projects. My primary use of git is for code backups rather than collaborating with others. However, in some of my recent work I’ve been sharing repositories with colleagues and students which has caused me to improve my git skills.

The following is some of the functionality I’ve only recently discovered that has been extremely helpful:

git cherry-pick commit-id-number

This command proved very useful when I recently forked a GitHub repo and made some changes to the source code for the specific project I’m working on. I soon discovered a bug in the original repository that a number of users had reported. I was able to fix the bug in my fork, but because my fork had changes that I didn’t want to contribute back to the original repository, I used the cherry-pick command to bring across only the specific commit related to the bug fix.
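
Roughly, the workflow looked like this, assuming the original repo has been added as a remote called upstream and the fix is a single commit (the URL and SHA are placeholders):

git remote add upstream https://github.com/example/project.git
git fetch upstream
git checkout -b bugfix-only upstream/master   # clean branch based on the original repo
git cherry-pick <commit-sha-of-fix>           # bring across only the bug-fix commit
git push origin bugfix-only                   # then open a pull request from this branch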

git checkout --theirs conflicted_file.php

Merge conflicts suck. But sometimes, despite trying to pull as often as possible, they still occur and can fill your code with ugly messes to clean up. I recently wanted to throw away my changes to a file and simply use the latest committed version. By using git checkout --theirs I was able to throw away all my changes and keep the version that had been committed and conflicted with mine. Conversely, you can use --ours to resolve the conflicted file in favour of local changes.
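
As a minimal sketch of that resolution flow during a merge (during a rebase the meanings of --ours and --theirs are swapped):

git checkout --theirs conflicted_file.php   # keep the incoming version of the file
git add conflicted_file.php                 # mark the conflict as resolved
git commit                                  # finish the merge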

git shortlog

During the past few weeks the students in the course I’ve been teaching this semester have used git to collaborate on group projects. The git shortlog command produces a list of commits grouped by each author allowing you to quickly see the relative rate at which people are contributing commits to a repository.
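
A couple of variations I find handy for this:

git shortlog -sn        # one line per author with their commit count, sorted by count
git shortlog -sn --all  # count commits on every branch, not just the current one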

git branch -a

When you clone a remote repository it fetches all of the remote branches, but if you just type git branch you’ll only see your local branches. The -a flag lists the remote-tracking branches as well.

git log --all

The same issue applies when you want to see the log of all commits across all branches: the standard git log command only shows the log for the current branch, while the --all flag shows the log across every branch. Combining this with the cherry-pick command is very useful when you want to bring across just one set of changes rather than merging a whole branch, as sketched below.
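
As a rough sketch of that combination (the SHA is a placeholder):

git log --all --oneline --graph   # find the commit of interest on another branch
git cherry-pick <commit-sha>      # apply just that commit to the current branch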

git log --all --stat --author="Tom"

Bringing this all together I’ve begun to regularly use the above command to see all commits by a single user across all branches. This has been a good way to measure students’ contributions to a group project (note: the author option is case sensitive).

Completely destroying all data on a Hard Drive

A few days ago, while clearing through some old boxes of computer equipment, I discovered an old hard drive. This drive had been removed from an old computer that had been disposed of. At the time of disposal I copied all the information from the old computer to its replacement and kept the old hard drive as a backup in case something went wrong.

Now, more than five years later, I no longer need the backup and want to dispose of the physical hard drive. But first, I want to ensure that the drive is completely clear of the old data. Connecting the drive to my current computer, I can still mount it and see all the old files. It’s good that the backup has lasted this long, but completely wiping the drive of all this old personal data is a little more complex than selecting all the contents and pressing the delete button, or doing a reformat under Windows.

Completely destroying all data on the drive is important. If the drive is not completely wiped (that is every single physical sector on the drive is written over) it is possible that someone using a few pieces of software could bring the old data back from the dead.

To completely nuke the drive I could pay for commercial software or take a hammer to the physical drive. But there is a free way to nuke the drive by using Ubuntu Linux and it’s quite simple to do:

  1. Boot Ubuntu (running from a live CD/USB should work too)
  2. Find the full device path of the drive you want to destroy by running at a terminal:
    sudo gparted

    If you don’t have gparted installed, then install it using

    sudo apt-get install gparted

    Then, on the right hand side of the GUI window, select each hard drive from the drop-down list and check its partitions to confirm the path of the device that you want to nuke. For instance:

    /dev/sdN
  3. Shred the drive using the following command, replacing the path with the path you found in step 2.
    sudo shred -vzn 1 /dev/sdN

    This command will take a while to run. The -n 1 option makes one pass writing random data into every single physical sector of the drive, and -z then makes a final pass writing zeros over the whole drive (-v shows progress).

Once the command has finished your drive will be completely wiped. It can now be reformatted and reused without any worry about someone being able to resurrect the previously deleted data.
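
If you want a quick sanity check that the final zero pass completed, reading back the start of the drive should show nothing but zeros, something along these lines (using the same /dev/sdN placeholder as above):

sudo head -c 1M /dev/sdN | hexdump -C | head   # an all-zero drive shows a row of 00s followed by a *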

SafePrice, Avast and Sneaky Browser Plugins

For many years now I have been using Avast as my anti-virus on my Windows computers. For the majority of that time it has been simple to use and generally non-invasive. However, in the past few months that has dramatically changed.

The first big change has been Avast prompting me to install software updates for almost every installed application. While this may be helpful for the vast majority of people who do not regularly update their systems, I’ve grown to ignore random pop-ups that say my computer is out of date, because the vast majority of the time they are scams or ads themselves. Saying my system is at critical risk because I haven’t updated an application in the last 24 hours is overkill, especially when the application’s own updater isn’t prompting for the update.

Today’s inappropriate interruption from Avast is much more annoying and downright unethical, especially as I did not authorise this behaviour in the application. Below is a screenshot of a product page from a trustworthy and popular online store I visit on a regular basis.

Avast SafePrice Website Takeover


Chinese Trojan Spam Virus Attacking Websites

Since installing Google Analytics I have been checking my web stats on a near-daily basis. However, because of my lack of blogging over the last few weeks, I have also been monitoring the stats less. Today I learnt my lesson that maybe I should maintain a daily watch: over the last few days (yesterday in particular) there has been a dramatic spike in the number of visits to my site, despite no new blog posts being added.

Looking at the data in more detail it appears a lot of traffic is being generated out of China by a site called qq829.com

Looking into this some more, there is a thread on the Google Analytics forum about a lot of traffic appearing from China.

Furthermore, both HubPages and Symantec have information on the Trojan that is causing the problem.

At this stage it does not appear that my website has been infected with malware or compromised in any way; however, please ensure that your antivirus software is up to date, as this particular Trojan could be costing you a lot of traffic and could potentially cause other problems.

Furthermore, I have now blocked traffic originating from the qq829 website. Other people are blocking all of China, but at this stage I am not considering that.

If you are facing similar weird problems with bursts of traffic to your site you can block the qq829 website by adding these lines to your .htaccess file.

# Flag any request whose Referer header contains the offending domains
SetEnvIfNoCase Referer "qq829\." TOBLOCK=1
SetEnvIfNoCase Referer "cnzz\." TOBLOCK=1

<FilesMatch "(.*)">
Order Allow,Deny
Allow from all
Deny from env=TOBLOCK
</FilesMatch>

# Also block the IP ranges the spam traffic was coming from
deny from 219.232.240.0/20
deny from 203.171.224.0/20

Ubuntu 10.04 Lucid Lynx First Impressions

Sometime in the next 24 hours, Beta 1 of Ubuntu 10.04 Lucid Lynx will be released to the world. This version of Ubuntu is different from the previous few versions for two key reasons: the first is that it is a long term support release and as such will [hopefully] be more stable and more complete than other versions over the past year. The second change is in the user interface, with a step away from the established brown “human” theme to a new theme that looks very Mac OS-like.

For the last two days I have been running the daily build of the AMD64 release candidate for 10.04 Beta 1. So far I am very impressed with it. For the past year I have been running 9.04 as the 9.10 release in October of last year broke support for my laptop’s wireless drivers and would cause frequent lock ups. I am pleased to report that those crashes are a thing of the past in 10.04.

The Good:

  • Fast boot. 9.04 was a massive improvement in boot time over 8.10 and I am surprised to see even more of an improvement in 10.04; from BIOS to logged in takes around 20 seconds.
  • Stable. Sometimes beta and test releases of software are so buggy that they cannot even be fully tested. So far I have hit a few minor problems, but overall I am very impressed.
  • Smooth. The x64 version is very smooth at booting, opening and closing windows, applications, etc. The entire operating system runs quietly and quickly.

The Bad:

  • Crash errors that are almost as cryptic as Windows BSODs and illegal operations. I have had two programs crash, and both times the crash errors were just strings of numbers or error codes with no meanings or descriptions. It is very hard to supply useful information in a bug report when you have no idea what went wrong; one minute it was working, the next it wasn’t.

The Ugly:

  • Video Drivers. I am running an ATI Radeon HD video card and there are no free or proprietary video card drivers at the moment. This means that any 2D or 3D video rendering is done through MESA software rendering and is very ugly. I hope this will be sorted out in the final release (along with the current bug where, if you try to install the old fglrx library, aptitude will try to remove ubuntu-desktop).
  • Software Install. If you want to install Ubuntu (and community) released software, this is a breeze through the Ubuntu Software Manager, but the instant you want to install any other piece of software you will need to go through the whole process of getting the source code, resolving dependencies, compiling through the terminal, sorting out linking errors and a whole lot of other nasty mess.
  • User Experience. Despite the new version of Ubuntu looking very pretty and running very fast, it still fails badly in terms of user experience for the average user. Ubuntu is meant to be Linux for human beings, but I am still finding it Linux for people who want Linux to work, have some computing knowledge for fixing things when they go wrong, and also have a Linux geek on hand to really fix things when they completely corrupt. Until vendors start releasing fully stable and supported drivers for Linux, and there is a software install process for third-party applications that works nicely through a simple GUI and not old-fashioned command windows, Ubuntu and Linux in general will continue to only attract nerds, geeks and people who like to break things. I like Ubuntu for its speed and ease of use in an office/development environment. But when I am at home on the weekend I live in Windows. Things just work in Windows: fonts render correctly, most software now plugs and plays correctly, most music and DVDs will just play, software is simple to install, etc. Now I do not want to start a paid vs free software argument, but just because it is free should not mean you need a whole lot of computing knowledge to get your email every morning.

Getting USB Browser Mice to work in Vista

I have had this issue with a number of mice and a number of different computers now. Some older USB mice will not work when you plug them into Windows Vista: a dialog appears saying it is installing software, and it then fails with an unknown device error.

The fix for this as I just found out this morning is quite simple:

  • Click on start
  • Right click on computer
  • Select properties
  • On the left side of the dialog that comes up select device manager
  • Scroll down the list of devices to the unknown device
  • Right click and select Update Driver Software
  • Select the option to choose from a list of drivers
  • Select Human Interface Device
  • Select HID compliant mouse
  • Click okay and the mouse should now work

Simple. And Windows had the drivers to make it work all along! Sometimes Windows does some really simple things wrong and as a result is just so frustrating. It is a mouse; it should just work!

Adventures in the land of building Google Chrome OS

Okay, I have now been working through the process of building Google Chrome OS for a little more than 12 hours. My main desktop computer has been on all night trying to sort out the development build environment so the code can be compiled. It does not help that we went over our data cap a few weeks back and are stuck on 64k internet until mid next week, which makes downloading the required files ultra slow.

The build instructions provided by Google are so far quite clear and straightforward to follow. However, they are not very detailed. There are no timings for each step of the process or information about what each step does. So far I have downloaded the full source code (270 MB) at uni so I would not have the dial-up speed internet problem. However, in order to compile the code a strict development environment is required, so the build script uses debootstrap to create a minimal Debian environment. While this is a cool feature designed to ensure every build remains consistent, it is a pain that this is not explained before the start of the process, because the amount of data required to set this up is a lot more than the entire source code for the operating system.

Because the process of building from scratch is so long, a build snapshot has been uploaded onto The Pirate Bay. This is a good idea, and I have seen comments on a few blogs that Google should be releasing a nightly build snapshot of the compiled OS. While this takes away the fun of building from scratch, it does make testing the OS a lot more accessible. It is something I hope Google implements soon.

Hopefully my next blog post on the OS will be a little more positive and a lot further down the building track.

Welcome to the Future – Windows 7 Professional x64 RTM

Yesterday I managed to get my hands on a copy of Windows 7 Professional through the MSDN Academic Alliance Programme at Uni.

To avoid messing around with my current Vista install, I decided to remove my old 160GB IDE hard disk from my old computer and install it into my new system (which isn’t that new anymore); being just out of warranty, I was safe to open the box and put in the hard disk.

First problem: whoever designed the motherboard and case layout in my new system never designed it for people to add stuff into. The IDE socket on the motherboard is located directly below the hard disk install location in the case, so the cable had to twist super tightly to get out from under the hard disk and then plug into it at a 90 degree angle. The second problem was that the heatsink on my processor is so large I couldn’t get the drive into the drive bay without loosening it a little and then resetting it. The third problem was cables: the system had all the cables nicely cable-tied down, but they had been placed into position so well that you couldn’t get to the spare power cables. Once I had cut away some of the cable ties, the mess of cables required a number of unpluggings and rewirings so I could get enough slack to get everything plugged in. Because of all this, a ten minute job turned into an hour and a half of frustration.

Once this was completed I booted back into Vista and partitioned the newly installed 160GB drive into a 120GB partition for Windows 7 and a 40GB partition for installing Ubuntu 9.10 later this month. Once this was set, in went the Windows 7 DVD. The installation of Windows 7 took less than 30 minutes and was incredibly straightforward. Easily the simplest installation of Windows I have ever done.

On the whole, Windows 7 can be summed up in one word: smooth. It is what Vista should have been. There are only minor differences in UI between the two operating systems, but those differences make a big difference in user experience. Gone is the quick launch bar; instead you can pin programs to the task bar, even if they are not running. The names of programs have vanished, replaced with large icons. The sidebar is gone; you can now put gadgets anywhere on your screen. Windows Media Centre also has support for Freeview, which is great, no messing about with codecs and Media Portal. Windows Aero and animations are incredibly fast and crisp. So far I am very impressed.

The chart below shows my system rating on Windows 7. The values have increased slightly from Vista (Vista scores in brackets).

Component (details): Windows 7 subscore (Vista subscore)

  • Processor (AMD Phenom(tm) 9600 Quad-Core Processor): 6.9 (5.9)
  • Memory (RAM) (4.00 GB): 5.9 (5.9)
  • Graphics (ATI Radeon HD 3400 Series): 3.5 (3.5)
  • Gaming graphics (1919 MB total available graphics memory): 5.1 (3.9)
  • Primary hard disk (87GB free of 112GB total): 5.3 (5.9)

Base score: 3.5 (determined by the lowest subscore) – Windows 7 Professional

The key things to note regarding the different scores are:

Processor – Vista is only 32 bit, Windows 7 is 64 bit

Memory – Vista is only 32 bit and therefore only has access to 3GB of RAM; Windows 7 has access to the full 4GB

Graphics – Aero doesn’t seem to take advantage of crossfire, so my system is always limited here. It is not a big feature anyway so I typically ignore this.

Gaming Graphics – This is the score that matters much more. For some reason the score on Windows 7 is a lot higher than on Vista. The first reason is that Windows 7 is giving CrossFire 512MB more graphics memory than Vista did. I can only guess the second reason is newer graphics card drivers in Windows 7.

Primary hard disk – The decrease in score here is caused by using an older IDE drive, compared to my primary Vista hard disk which is SATA.