The New Mac Pro

Posted by mitch on June 11, 2013
hardware

I am very excited about the new Mac Pro.

We don’t know the price yet. We don’t have full specifications. It’s not clear this form factor will ever support dual CPU packages or 8 DIMM slots (it seems it might only have 4 sockets). Four 32 GB DIMMs currently run about $10,000 at B&H. Happily, four 16 GB DIMMs are a lot less—around $1,200. 64 GB of RAM is sufficient for me for now, but based on how my need for memory has grown in the past, I am hoping to see a 128 GB option for around $1,200 within two years of owning the machine.

Apple does claim flash I/O throughput of around 1250 MB/s, which is better than the four-disk SATA SSD RAID 1+0 in my current Mac Pro and faster than my first-generation OWC Accelsior PCIe card.

Apple mentions up to 2×6 GB of dedicated video RAM, which significantly beats the 1–3 GB cards we’ve had on the market until now. I am also excited at the prospect of 30″ displays at 3840×2160. My three Apple 30″ displays are starting to show their age in terms of backlight wear—it takes longer and longer for them to come to full brightness. I bought a Dell 30″ for my other desk, and I had to buy a calibrator to get acceptable color out of it. So I am hopeful Apple will ship a matte 30″ 4K display… (this seems rather unlikely).

Only four USB ports is a shame, but not the end of the world. Hopefully the USB 3 hub issues with Macs will be resolved soon.

And then there are the PCI slots. My Mac Pro currently has a 7950 video card in one slot, an Areca 1680x, an eSATA card that I quit using, and the PCIe Accelsior. Frankly, the new Mac Pro meets my PCI expansion needs—external chassis are cheap if I ever really need slots (just $980 for a 3-slot Magma, and Apple mentions expansion chassis are supported). What makes this possible is that Thunderbolt RAIDs are just as fast as Areca SAS configurations and generally require a lot less monkeying around. I have two Promise 18 TB Thunderbolt RAIDs connected to a Mac mini in my basement for Time Machine backups, and they have been fantastic.

So I imagine my 2013 Mac Pro will look like the following configuration:

  • Mac Pro with 8 or 12 cores, depending on price and clock options
  • 64 GB of RAM
  • 512 GB–1 TB flash storage for boot
  • Thunderbolt ports 1-3 — with DisplayPort adapters for existing displays
  • Thunderbolt port 4 — 12-24 TB Thunderbolt RAID for home directory. I’d love to see a 12×2.5″ SSD RAID 1+0 when 1 TB SSDs get under the $400 price point.
  • 3 USB ports connected to hubs
  • 1 USB port connected to external hard disk for cloning boot drive to
  • Hopefully the audio line out has an optical connection, like the AirPort Express and other optical products.

I think this will fit my needs pretty well, as long as a 128 GB RAM upgrade is cheap enough down the line. 256 GB would have been a lot nicer.

And best of all, this configuration will free up at least 4 sq ft of floor space where my Mac Pro and SAS chassis sit. If the computer is quiet enough to sit on the desk, then the Mac Pro and the Thunderbolt RAID together take up only about 1.5 sq ft, which would be a tremendous improvement in my office, where space is at a premium.

Update: I take issue with the complainers who say that the new Mac Pro will lead to a big cable mess. For me, I expect it will be about the same, but take up less floor space.


Sennheiser RS220 vs Sennheiser RS180

Posted by mitch on May 13, 2013
audio


Almost two years ago, I wrote a post comparing the Sennheiser RS 180 headphones to my really old Sony IR wireless headphones. It was an easy post to write; the RS 180s were the best thing going in wireless headphones at the time, as far as I know.

In March 2012, Sennheiser released their first wireless headphones positioned in their audiophile line-up. Until then, the audiophile models were the HD 518, 558, 598, 600, 650, and 800 (and now the new 700 slots in between the 650 and 800 at $1,000). The new RS 220 has been out for a year and has received positive reviews from media publications, but horrible reviews on Amazon and in forum discussions due to serious signal drop-out issues. I held off buying them for a while, fearing those issues were real.

But in a moment of frustration with the RS 180s, I took the plunge (and Amazon has a good return policy). The drop-outs were in fact real and serious. Thanks to a post over on Head-fi.org, I learned one fellow had switched his Wi-Fi network to channel 11; I did the same, and that has mostly solved the drop-outs for me.

So if you can solve the signal drops, how are these headphones?

They are fantastic–these are the best wireless headphones on the market. Do they have the same sound quality as my HD 800 rig? No, but at $600 they are a quarter of the price of my HD 800 set-up, weigh less, and have no wire to the headphones. My main frustration with the RS 180s ($280–$320 street price) is that piano and classical music sound quite muddy through them. The 180s seem better suited to watching TV and listening to modern pop music than to anything with fine detail–and for what they are good at, they are great. But the RS 220s are much better, with the drawbacks of shorter range, less battery life, and the darned wireless signal issues.

For me, the trade-off is worth it as long as the wireless issues remain infrequent. There’s a lot going on in the 2.4 GHz range–WiFi, Bluetooth, cordless mice, microwave ovens–so I remain a bit apprehensive. After listening to the 220s, I can say that the 180s experience signal drops as well, just more subtle and less irritating ones. An RS 180 signal drop is like a record skipping, whereas an RS 220 drop feels like an empty second or two of silence on a cell phone call.

Physically, the 220s are much more comfortable than the 180s. The padding is thicker, and the headband isn’t as “crushing”. Beware that the headphones are open, meaning they are not for private listening. The other perk of the 220s is that the base has an audio output, which let me get rid of a switchbox for choosing between headphones and the M-Audio BX5 D2 speakers on my desk. I use a Belkin remote-controlled power strip to turn the M-Audios on and off, so this has simplified my desk a little bit. I also like that the RS 220 base is easy to turn on and off with one hand–the 180 base is very lightweight and its buttons require a firm push.

I am using my 220s with an AudioEngine D1 DAC ($180). It probably doesn’t do the 220s justice, but it’s small and has a volume control, which is nice. I don’t feel I have enough room on my desk for something much larger. I have 2 ft AudioQuest cables connecting the DAC to the RS 220 base, which seems fine. The 220 base also has optical input, but I like having the volume control on the AudioEngine unit, so I intend to keep using it rather than connect the computer’s optical out directly to the 220 base.


My journey to Sanebox

Posted by mitch on March 31, 2013
productivity

(If you just want to read why Sanebox rocks, scroll down.)

Remember Xobni? Originally it was a plug-in for Outlook that did a few things: when a message was viewed in the normal Outlook interface, the Xobni plug-in would show other emails you had exchanged with the sender, along with files that had been shared with that person via attachments. It showed social information about the sender–profile pictures from LinkedIn, number of connections, and so on. And finally, it enabled better email search than Outlook had.

Adam Smith, the founding CEO, gave a talk at MIT and for years, one of the things he said has been stuck in my head: “If you can improve email just a little bit, then you can create a lot of value.”

Mr. Smith was absolutely right. Email is incredibly poorly suited to how it is used in most workflows today. In 2010, Mr. Smith left day-to-day activities at Xobni, and I’m afraid the company lost its way, becoming a contact tool rather than an email tool somewhere along the line. That didn’t work out well for Plaxo, and I am not convinced it will work for Xobni. For a few years now, there hasn’t been any innovation in email that was interesting to me.

But in the last few months, a few things have come to light:

First, there is Mailbox, recently acquired by Dropbox. Mailbox built an incredible amount of hype, had a great video that looked interesting, got accused of some pump-and-dump action with TechCrunch, and ran up to over a million users quite rapidly. I lost interest in the app within minutes of using it; it was clear it was not going to scale to the amount of email in my inbox. One feature I did particularly like was the concept of putting an alarm on an email (e.g., a reminder to re-triage the message in a day).

Then there is Mailstrom. I am not really sure what Mailstrom does, but it sends me weekly reports of what’s going on in my inbox. There’s a tool you can use if you remember to go to their site and want to tediously triage stuff. I don’t want to do that. The web site talks about getting to Inbox Zero, but I will never pull it off with what they offer. The report is kind of cool, though.


Finally, there’s Sanebox. Sanebox analyzes the to/from patterns in your inbox and sent mail and automatically moves an email to a folder called @SaneLater if it doesn’t believe you will reply to it. So all bulk email ends up in @SaneLater. This has made dealing with email a ton easier. Sanebox also puts emails older than about 8 or 9 months into a folder called @SaneArchive. I went from 30,000 emails in my personal inbox to just 1,700 in my Inbox and 1,800 in @SaneLater. It is now much easier to see which emails require replies.

Sanebox offers a free trial that runs about 2 weeks. Towards the end, they convert customers with a genius piece of messaging (paraphrased): “Hey, if you don’t pay us, we’ll move all your email back the way it was!” Brilliant–I bought it.

Email still sucks. But Sanebox has made it suck a bit less. And the best part is that there is no additional user interface to use.


Shoddy Engineering Practices

Posted by mitch on March 26, 2013
software

Do you find yourself saying things that are so absurd you wouldn’t believe them if you hadn’t lived through them? I’ve had a few of those experiences.

I once worked at a software company that had a strong unit test culture. There were specifications for everything and tests for everything. Sounds great, right? Unfortunately, despite extensive review meetings, the specifications never survived the first few days of coding. The unit tests were detailed and thorough–fantastic! But no one ever ran the whole product in-house. QA didn’t. Developers didn’t. The company didn’t use its own product because “it is too complicated.” After I left the company, I had a problem that the company’s product was designed to solve. As a shareholder, I was interested in the company having customers, and happy to be one. I downloaded the product and installed it. When I fired it up, the product crashed about three lines into main(). No one had ever used the product. It’s great to prove that the pieces work individually, but your customers will be using them bolted together. You have to make sure the holes line up.

I also once worked at a company with the exact opposite approach. There were no unit tests of any kind. No libraries, no modules, nothing. The only way to test anything was to run the product. This meant there were very few crashes on the third line of main(), but it also made it impossible to know whether any of the pieces of the code worked independently. If you can determine whether pieces work independently, you can track those results, tie them to check-ins, and suddenly understand a heck of a lot about which changes in your source control impacted which areas of the product. You can track down not only data corruption but also which modules hit performance issues, whether in a unit-test context or in an integration context in a customer environment.

The first company had a hard time getting customers to be successful with the product. The second company had difficulty predicting when a release would be ready. The former is easier to fix–start using the product internally and also fix the QA organization to include system tests. The latter likely entails an expensive refactor and reflects deeper cultural issues of a hacking organization rather than an engineering organization. Neither extreme is healthy.


Getting Back Into It

Posted by mitch on March 10, 2013
home

Over the last 25 weeks, I’ve been trying hard to do very little beyond sleep and goof off. Even though I don’t have a job and have been considering myself on vacation, looking back at my calendar, I still managed to do over 100 business-related meetings and spent 4 weeks on the road. Nevertheless, I’ve been relaxing–I bought a PlayStation Vita last March and didn’t start playing it until January.

I’m starting to get going on a few projects, and as I fiddle around with possible next directions, I’ve come to realize that I don’t have enough desk space to juggle completely different activities when one hits a creative roadblock. I was at a standstill most of last week because my desk was covered with a project that wasn’t going anywhere. I needed another desk to dump it on so I could still look at it but engage with another project while it percolated. Since I’m doing some other remodeling this year, I started looking at reclaiming my closet (which is just a junk room) to pick up an extra 35 sq ft of “spreading out” surface area.

I’m also finding that I have too little filing space: despite moving most “archive” files into banker boxes and shredding about 60 gallons of files, I still don’t have enough room. So this configuration adds a second lateral file cabinet.

Some other remodel considerations for the office:

1. Build a large built-in bookcase at the front of the office. In this configuration, it will hold most of my office-related books–barely, which would still be a big improvement over the current situation. Below is a rendering that my architect created of the bookcase. I also plan to reclaim space under the eave for the stereo, so it no longer takes up floor space in the room.

2. Remove the chimney. I should have done this 5 years ago but felt like I had feature-creeped on the first remodel too much as it was. To finish a software release, you have to stop adding crap to the release and get things fixed. The same applies to a remodel.
3. Finally install a split AC system. I got a quote for this years ago, but could never get the fellow to come out and do the work.

Sadly, remodeling in real life isn’t as fast as a few hours of monkeying around in SketchUp… so until then, piles of stuff it is!

If you came to this post hoping to read about what I’m working on and you weren’t happy to hear about goofing off, come back in 25 weeks!


Stop Hating on Bose?

Posted by mitch on February 14, 2013
audio

Sennheiser PXC-350 (left) and Bose QC-15 (right).

For years I’ve been fascinated by the hate against Bose products. Bose must be really bad to get all the negative reviews, right? Search any electronics forum or Amazon reviews, and you’ll find thousands of people frothing about how much Bose sucks.

In December 2009, I wanted a cheap pair of computer speakers for my office in California. I didn’t need anything fancy and I didn’t want a subwoofer. I went to Fry’s and the only 2-speaker system they had for a reasonable price was the Bose Companion 2 speakers for $100. Sighing, I bought them.

They weren’t super awesome. In fact, they were pretty muddy. I gave them a negative review on Amazon. However, they were $100 and small. At this point they sit in the closet; I have a pair of M-Audio BX5 D2s on my desk, which take up significantly more room and sacrifice a lot of usability. They are plugged into a cheap AudioEngine DAC/amp, which means the whole system cost four times what the Bose Companion 2s did. (Update: After I posted this, I remembered that when I moved the Bose Companion 2s to my office in Boston, they sounded a lot better–the acoustics in my California office were crap, I suppose.)

Fast-forward to the middle of 2011: I decided to get rid of my stereo separates in the bedroom. My cleaners were always moving the speakers, disconnecting the speakers, and the whole system took up a lot of room. With some reluctance, I bought a Bose Wave radio/CD player–I couldn’t find anything at any price point whose looks I liked better than the Bose. There are competing products for less money, but they look like crap. I wanted something that looked good.

It’s an expensive box–$500 for a radio, CD player, amp, and speakers. If you listen to the Wave within 2 feet of the unit, it is indeed “bass-y” and “boom-y.” But if you listen to it across the room, it sounds great! I really love my Bose Wave system.

Back in 2009, since I “knew” that Bose sucked, I bought the Sennheiser PXC-350 headphones for air travel. The current model is the Sennheiser PXC-450, which runs $350 on Amazon as of this writing. Recently I misplaced the Sennheisers and bought the Bose QuietComfort 15 headphones from the local Apple Store.

I could not believe how good the Bose QuietComforts are. I suspect they have a bit of a low-pass filter in them–they are not as accurate as, say, a pair of Sennheiser HD 800s. But I don’t care about accuracy in noise-canceling headphones! I don’t want to hear engine noise, fan noise, or people talking when I am wearing them. Without sound playing, the Bose headphones are dead silent in my office, which has a bit of desk-fan noise. The Sennheiser PXC-350s pass some of that noise through and introduce the hiss that is often an artifact of cheaper noise-canceling headphones.

The cord on the Sennheiser ‘phones is much nicer and has a volume control. The Bose came with two cords, one without controls and one with an Apple remote. The Apple remote works fine with my iPhone 4S, but with my current-generation iPod Nano, it introduces feedback noise that is unacceptable. That’s a serious issue, either with my iPod or the headphones.

However, armed with better silence, smaller size, and lighter weight, the Bose headphones are a clear winner.

So if you’ve been avoiding Bose because you’ve heard they suck, maybe take another look. If you’re looking for accurate listening, you’ll note that I said above none of these Bose products produce accurate sound to my ear. Personally, I don’t need accurate listening for my bedroom, riding the train, flying in a plane, or to hear that Skype is ringing.

Some photos:

The Bose headphones are quite a bit smaller.

Comparing the cup size. The Bose headphones are tighter on the ear, but not to the point it is uncomfortable.

7.0 oz vs 10.0 oz

Despite being a Sennheiser fan, I can say that the Bose QC 15s are a much better buy for typical noise-canceling applications.


I hate pop quizzes from computers. Also, the UPS/FedEx landing page sucks.

Posted by mitch on October 31, 2012
software

How many times have you seen the UPS landing page below?

Why is it that neither FedEx nor UPS knows about geolocation of IP addresses? Even if geolocation were used just as a hint, with an obvious way to change it when it went wrong, far fewer folks would hit these screens. And after you make a selection, the UPS site just sits there and does nothing until you click the small blue button.

I hate software that makes me deal with pop quizzes and stupid modal dialogs. Hey, where are you? Would you like fast or small? Want to update now? Restart Firefox now? Hey, you will need to reboot after you install this, OK? Downloading this RPM will use 30 KB of disk space, sound good? Really quit? There’s an item on your calendar coming up in 10 minutes, you won’t be able to click anywhere in Gmail until you click this OK button… on all of your computers. I couldn’t copy that to the clipboard. Send a bug report to Apple? This document has ColorSync profile 1, and this other one has ColorSync profile 2. Would you like me to alert when submitting a form to an unencrypted site? There was an error with your request, try again later. All the downloadable content has downloaded. I am going to go ahead and join you to the unsecured network ‘linksys’. Hey, couldn’t backup this computer. Holy shit, there’s new software available, I don’t care if you’re watching a movie fullscreen (iTunes + Notification Center). Psst, mind if I phone home real quick? Oh, by the way, the iTunes terms and conditions have changed and I suddenly can’t remember your credit card number.

Who wouldn’t hate a machine that talks to you this way, all day, every day? I am trying to get some work done here, and you’re telling me all this out-of-band shit and very little of it is useful to me right now.

Please stop asking me questions. I don’t know what the answer is and you’re interrupting my train of thought, which I value far more than pondering any of the above questions.


Ethernet hwaddr and EEPROM storage with Arduino

Posted by mitch on October 31, 2012
hardware, projects, software

There are lots of examples of how to use the Wiznet Ethernet chips with Arduino, whether as Ethernet shields or as single-board Ethernet Arduinos. Unfortunately, most of these examples hard-code the hardware (MAC) address, which can make things painful if you’re building more than one device and running them on the same network.

The code snippet below is a more convenient approach. You can set up a prefix (DEADBEEF in the example below), and the last two bytes are set randomly on first boot. The hardware address is stored in EEPROM (7 bytes are needed: 1 for a flag indicating that the next 6 bytes are properly populated).

The bytes-to-String conversion below is a bit ugly, but I didn’t want the overhead of sprintf for this. It is probably not worth the trade-off. (0x30 is ‘0’ and 0x39 is ‘9’. Adding 0x07 skips over some ASCII characters to ‘A’.)

Some serious caveats: There are only two bytes of randomness here. You might want more. Ideally you would have a manufacturing process, but if you’re just building six devices, who cares? Clearly you would never use this approach in a production environment, but it’s easier than changing the firmware for every device in a hobby environment. You could also use a separate program to write the EEPROM hardware address and keep this “manufacturing junk” out of your main firmware. These issues aside, my main requirement is convenience: I want to be able to burn a single image onto a new board and be up and running immediately, without having to remember other steps. Convenience influences repeatability.

#include <Ethernet.h>
#include <EEPROM.h>

// This is a template address; the last two bytes will be randomly
// generated on the first boot and filled in.  On later boots, the
// bytes are pulled from EEPROM.
byte NETWORK_HW_ADDRESS[] = { 0xDE, 0xAD, 0xBE, 0xEF, 0x00, 0x00};
String NETWORK_HW_ADDRESS_STRING = "ERROR_NOT_FILLED_IN";

// These are commented out so that this code will not compile
// without the reader modifying these lines.  If you are using
// EEPROM code in your program already, you need to put the
// network address somewhere that doesn't collide with existing use.
//#define EEPROM_INIT_FLAG_ADDR 0
//#define EEPROM_HWADDR_START_ADDR 1

// Call this from your setup routine (see below)
void
initEthernetHardwareAddress() {
    int eeprom_flag = EEPROM.read(EEPROM_INIT_FLAG_ADDR);
    int i;
    Serial.println("EEPROM flag is " + String(eeprom_flag));

    if (eeprom_flag != 0xCC) {
        // Seed the generator from a floating analog pin; without this,
        // random() produces the same two bytes on every board.
        randomSeed(analogRead(A0));
        NETWORK_HW_ADDRESS[4] = random(256);  // random(256) returns 0-255
        NETWORK_HW_ADDRESS[5] = random(256);
        
        // write it out.
        Serial.println("Writing generated hwaddr to EEPROM...");
        for (i = 0; i < 6; i++) {
            EEPROM.write(EEPROM_HWADDR_START_ADDR + i,
                         NETWORK_HW_ADDRESS[i]);
        }

        EEPROM.write(EEPROM_INIT_FLAG_ADDR, 0xCC);
    } else {
        Serial.println("Reading network hwaddr from EEPROM...");
        for (i = 0; i < 6; i++) {
            NETWORK_HW_ADDRESS[i] =
                EEPROM.read(EEPROM_HWADDR_START_ADDR + i);
        }
    }
    
    char hw_string[13];
    hw_string[12] = '\0';
    for (i = 0; i < 6; i++) {
        int j = i * 2;
        
        int the_byte    = NETWORK_HW_ADDRESS[i];
        int first_part  = (the_byte & 0xf0) >> 4;
        int second_part = (the_byte & 0x0f);
        
        first_part  += 0x30;
        second_part += 0x30;
        
        if (first_part > 0x39) {
            first_part += 0x07;
        }
        
        if (second_part > 0x39) {
            second_part += 0x07;
        }
        
        hw_string[j] = first_part;
        hw_string[j + 1] = second_part;
        
    }

    NETWORK_HW_ADDRESS_STRING = String(hw_string);

    Serial.println("NETWORK_ADDR = " + NETWORK_HW_ADDRESS_STRING);
}

void
setup() {
    // first call the usual Serial.begin and so forth...

    // setup the Ethernet hwaddr before you start using networking
    initEthernetHardwareAddress();

    // Ethernet.begin() with only a MAC address uses DHCP;
    // it returns 0 if no lease could be obtained.
    int dhcp_worked = Ethernet.begin(NETWORK_HW_ADDRESS);

    // ...
}

Who Eliminated the Windows Advantage?

Posted by mitch on July 08, 2012
business

Much has been written in the last few weeks about the new numbers on Apple’s accelerating market share gains against Windows. The Business Insider article giving Apple all the credit for making this happen made me wonder–is it really only Apple who should get credit?

Certainly Apple has done a few things that have enabled its position in the market beyond the simple iPod/iPhone/iPad halo:

  1. Moving to Intel. This brought fast virtualization to the Mac vs the old x86 emulation software or weird add-in boards with x86 processors. It meant anyone could run Windows on a Mac for the cost of Windows plus $50 for VMware Fusion, or dual-boot if running OS X wasn’t in the cards. This reduced cost and risk for folks who wanted to make the switch. Moving to Intel also enabled some of the focused work on smaller machines, such as the Air. I can’t imagine IBM or Motorola spending the R&D dollars to develop PowerPC chips of sufficient caliber and thermal characteristics for a MacBook Air; they didn’t even have the business justification (or technology) for a PowerBook G5.
  2. Building the best hardware and doing it at a ridiculously good price. Remember when laptops like the 11” Air were premium products for executives and no one else? Now the 11” notebook is the 2nd cheapest Mac.
  3. Great marketing. The Mac vs PC commercials are accessible to anyone. They changed the game from a geeky, specification-driven purchase to one based on actual objectives or, as was the original vision, an appliance purchase.

But more broadly, applications have changed:

  1. Web-based email. Whether or not you use web-based email at work, many folks use it at home. With Google Apps, lots of businesses can avoid the Exchange mess. When my company was bought last year, I had to migrate from Google Apps to Outlook. It really sucked. I’m glad to be using Google Apps again.
  2. Web-based applications are big in business as well. Salesforce and plenty of other vendors provide serious apps for business. A dozen years ago, I worked at a Fortune 500 company where every engineer had a Sun workstation on their desk for at least one reason: access to the bug system. Can you imagine? A $25,000 piece of hardware just to use one proprietary tool that didn’t need Sun performance or really anything else Sun was providing. Every engineer had a Windows system as well, for Outlook and the requirements-tracking software. It was an expensive operation.
  3. I can’t tell whether the Microsoft antitrust settlement helped with some of the progress we’ve seen. Samba works with Active Directory, finally (though it didn’t until 2003); there are many client and server implementations of the Exchange protocols (which is part of what enables Google Apps, Zimbra, and perhaps also Apple Mail to play in an Exchange world). Bruce Perens wasn’t excited back in 2002, but I don’t know what his perspective would be now.

These influences, in addition to the incredible excitement that Apple has built around first the iPod and later the iPhone and iPad, have enabled Apple to get to where it is today. I don’t think the halo effect, without the above, would have been enough.

I’m sure I missed plenty of influences. What do you think?


svn diff wrapper for diff(1) and tkdiff

Posted by mitch on June 27, 2012
software

I have been a big fan of xxdiff for many years–so much so that I wrote a Mac GUI clone of it as a mental exercise while on vacation in 2006. Unfortunately, xxdiff with svn has always sucked: it requires awkward scripts to plug it into svn, and those don’t tie into ‘svn diff --diff-cmd’ very well. A buddy recommended I switch to tkdiff, and it’s not bad at all. This morning I threw together a wrapper for two-file diffs to point my ~/.subversion/config at. Since I do a lot of work both over SSH and in X11 terminals, I wanted something that could relieve me of typing ‘--diff-cmd diff’ all day long.

#!/bin/bash
#
# Script to enable 'svn diff' as a GUI diff in X11 or a CLI diff
# on the command line (or if stdout is redirected).
#
# I couldn't get tkdiff to take a -geometry argument. I ended up
# setting my geometry in tkdiff's opts dictionary.
#
# Arguments coming in from svn diff:
#
# -u -L src/hello.cpp (revision 1234) -L src/hello.cpp (working copy) src/.svn/text-base/hello.cpp.svn-base src/hello.cpp
#

dash_u="$1"
label1="$3"
label2="$5"
file1="$6"
file2="$7"

# Use tkdiff when stdout is a terminal and X11 is available;
# fall back to plain diff otherwise (including redirected output).
cmd=diff
if [ -t 1 ] && [ -n "$DISPLAY" ]; then
    cmd=tkdiff
fi

# Pass -L and each label as separate arguments; both diff and
# tkdiff accept -L to override the displayed file names.
if [ "$cmd" = "diff" ]; then
    diff "$dash_u" -L "$label1" -L "$label2" "$file1" "$file2"
else
    tkdiff -L "$label1" -L "$label2" "$file1" "$file2"
fi

exit $?
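For reference, hooking the wrapper in globally happens in the [helpers] section of ~/.subversion/config (the script path below is hypothetical):

```ini
[helpers]
# Used by 'svn diff' for two-file comparisons
diff-cmd = /usr/local/bin/svn-diff-wrapper.sh
```

With that in place, a plain ‘svn diff’ picks the right diff tool automatically.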
