Welcome to edgylogic, drive:activated visitors. This is my new home on the net.

drive:activated archive

  • Do we really want a SaaS world?

    One of the big IT buzzwords of recent years has been software as a service (SaaS). Combined with cloud computing, it has been positioned as a paradigm shift in the way the world interacts with IT. It has changed much of the business world, and is already making inroads into the consumer world. In fact, many companies out there have staked their existence on it. Some, like Salesforce and 37signals, are doing ridiculously well; others, like many of those web 2.0 startups that have since disappeared, haven't fared so well.

    It's appealing to software companies because it gives them a consistent cashflow. In the 'old' model of selling software, companies had to keep releasing new versions with enough enticing features (or fixes) for users to upgrade, in order to maintain and increase cashflow. The new SaaS model keeps the money rolling in on a regular basis in the form of client service fees, without the need to continually refresh the product (although those that do generally fare better). They also benefit from better control over their software (as it resides on their servers), so they can patch, update and upgrade as they want, and only support one version - the latest. On top of that, the piracy issue - the one Microsoft spent, and still spends, millions tackling in what is probably a futile uphill battle - disappears.

    Customers win too, because they don't have to worry about any installation, maintenance, or data security steps, plus they often get the added bonus of having their software accessible from any net-connected computer in the world. For once, users can just concentrate on doing what they got the software for. Lastly, the pricing structure is psychologically better. Let's face it - many of us are short-term thinkers in need of immediate satisfaction. A smaller price upfront with a long commitment is easier to swallow than a larger upfront cost with no strings attached.

    But, like most things, an arrangement that seems to benefit all parties usually has a downside, and it's often not in the customer's favour.

    With SaaS, the problem is control. By going with a SaaS solution, users relinquish control over their data and how they use the software to the solution provider. For many SaaS solutions, data ownership is a major question that is often ignored by users, who are still getting to grips with the concept. At worst, the SaaS provider owns your data (probably through some wordy legalese stuck in a Terms and Conditions page that's only reachable via a single hidden link), and at best, you own it. But often, this is left ambiguous or not mentioned at all, and no one really cares either way. Do you really know what your provider can legally do with your data, regardless of whether or not you think you own it? I mean, pretty web 2.0 sites complete with slick effects, glassy graphics and glowing buttons can't be evil, right?

    And even if you do apparently have ownership rights, do you really have the time or money to fight for them if the provider decides otherwise (often against large corporations with their super lawyers)? Let's not forget that many of these SaaS providers operate in different countries, so our data is subject to different laws, and not protected by ours. Do we really know the implications of sending that email using Gmail?

    Ok, let's assume you retain ownership of your data. So what can you do with it? Data ownership is really only a small part of data freedom. What kind of access do I have to my data? Who is responsible if my data is lost or corrupted? Email is probably the oldest form of modern SaaS. Anyone ever tried moving all their emails from Yahoo! Mail to, say, Thunderbird on your desktop? It is a painful task, and one that can't be completely done (what about my sent emails?). This is actually an issue with both the old and the SaaS models though. Many apps on both sides of the fence offer open APIs and specifications which help alleviate this issue, while the more difficult ones are usually worked around using reverse-engineering and/or web parsing. The Data Portability project is trying to help, but it's a pretty big battle, not helped by so many new things coming out, all working with different sets of data.

    The old model, however, has an advantage here - the user's access to their data via the software is not affected by outside factors. As long as the software runs, even if it means running on legacy hardware, the user can still access their data. With SaaS solutions, if the solution is discontinued, or the company folds, then your access to the data depends on whatever access the provider grants, and on whether other software accepts their data format. It's worth noting here that the openness of the data format really means nothing to your typical non-geeky customer - so what if it is in XML, YAML, or whatever the popular general data representation format is? Some may be able to plead their case and win over developers to help them out, but most customers won't do that or get that chance. To them, their data is as good as gone.

    That raises one of the primary conceptual differences in buying a SaaS subscription, compared to actual software - you do not own a copy of the actual software, or even a licence to it. If you want to use that bit of software, you have to keep paying for it (and there is generally no way around it, illegal or otherwise). If you can't pay for it, you can't use the software. As simple as that (ok, you may get leniency from the provider for whatever reason, but that's not the point). In effect, you are renting the software, but unlike traditional appliance rentals where you can choose to pay off and own the appliance, there is no such option here. If for whatever reason you are unable to pay the fee, you'll have nothing to show for all the money you have paid in the past, and your data is stuck in limbo. And if you dislike the new version of the software and prefer the older one, chances are it's bad luck - you have to change.

    Do the advantages of SaaS outweigh the disadvantages? I can't help but feel short-changed. There is a certain sense of security and pride from owning something that you don't get from renting something. The fact that we have grown up with the notion of being able to own software (or at least a licence to it) doesn't help either - this is a somewhat subtle, but giant shift in the way we acquire and use software, and the awareness of the new set of issues just doesn't exist for most people. From what I've seen, many SaaS companies seem content with capitalizing on this lack of awareness too.

    Purchasing power has also partly shifted from the consumer to the provider; consumer sovereignty has taken a hit. The incentive for continuous innovation and improvement driven by the consumer's purchasing power has been reduced - in some cases quite significantly, depending on the service's data portability rules and the availability of competitors. While consumers lose their power, the software companies gain a system which yields consistent cashflow with a weaker link between their product and what the consumer wants.

    It also becomes harder for consumers to vote with their money, as the barrier to change grows due to the SaaS provider's increased control. Assuming we don't want to pay for two similar services at once, changing over to another SaaS service could quite possibly mean the loss of access to the old software, and possibly the data as well - a cost that is too high for most people.

    Are we, as consumers, really better off under this model, or is this just a ploy by software companies to shore up their financial position?

    UPDATE (1/5/2008): Added a link to the data portability project, and a quick addition to what providers can do with your data.

  • Idea: end-to-end project development platform

    A while back I was brainstorming ideas for a project to keep me occupied during the break. I'm a big fan of brainstorming and mindmaps, so I opened up MindManager (great program by the way) and started stormin' away. Things were going well, and soon I had a few solid leads that I wanted to follow. Being the indecisive person that I am, I asked a mate for some opinions on the leads I had and to add any thoughts he had.

    He didn't have MindManager, and I didn't want to do it via clunky email, so I thought, hey, let's use one of those collaborative web 2.0 mindmapping sites! (It was a real lightbulb moment, I tell ya.) I uploaded the MindManager file on to MindMeister, set collaboration rights, and away we went. Or so we thought - the collaboration feature was still a bit half-baked back then, so it wasn't much better than emailing.

    I downloaded the mindmap back into MindManager (so I could use it when I don't have net access), and started planning the lead I had chosen. Now, as good as MindManager is, it really doesn't cut it for me when it comes to project management (unless integrated with Project or Outlook, which I didn't want to do).

    Without any other project management tool on my computer (and I didn't want to resort to Excel), I looked online. There were the usual ones, including Basecamp, goplan, huddle etc.

    This was a real hassle, and a bit of forward thinking told me it was only going to get harder, so I realised I had discovered a real issue when it comes to executing projects. There is no service available out there that provides a complete feedback loop for a project, from the brainstorming and development stages, to the delivery and customer feedback stages, and all the way back to the starting stages for version 2.

    The solutions out there only solve parts of the loop - e.g. MindMeister for brainstorming, Basecamp for project management, kluster for crowdsourcing and feedback, maybe ning for a social network/forums, wordpress for a blog, the list goes on - but none integrate the entire loop together. But that's the way it should be - integrated. Ideas that were brainstormed and accepted should appear on the project todo list, feedback from a prototype should flow back to the brainstorm and management stages for improvement, and frequent problems reported in forums should be added to the planning stages for resolution in the next version. We shouldn't have to jump through hoops to be organised - it should be done for us. And users should feel like they're part of the process, not victims of the numerous systems in play to solicit feedback which probably gets lost anyway.

    In other words, it's crowdsourcing meets project management. And for those who have been reading, yes, I called this project loopboard.

    Features could include:

    • Brainstorming
      • Mindmapping
      • Whiteboards
      • Lists

    You could make it so you can insert 'dynamic' sections into your mindmap/whiteboard, so say, the top 10 bugs are visible and become part of your planning process automatically. Or maybe a dynamic section with the top 10 news items from your competitors.

    • Project management
      • Milestones
      • Goals
      • To-do
      • File-sharing

    Maybe even integrate with Basecamp via their APIs for this. You should be able to selectively make things public, so that not all your plans are revealed if you don't want them to be.

    • Crowdsourcing
      • user profiles
        • friends
        • roles in projects
        • fans of projects
        • skills
        • personal details
        • blog
      • discussions - will be tightly task-driven, e.g. anything pertaining to a particular milestone will be associated with that milestone, anything about a particular to-do list item, will be linked with that. Will have discussions page that draws out all the discussions going on, but it will be obvious when viewing a particular thread what that thread is linked to. Will have quick interface to translate a discussion point into an action task.
      • project profiles
        • description
        • team members
        • goals
        • announcements (facebook wall-like)
        • blogs (project plus relevant team member entries aggregated)
        • needs (e.g. investment, manpower)
      • wiki/knowledge base
      • ideas - complete with a voting system
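
    The profile, project and discussion structures listed above could be sketched as a minimal data model. This is only a rough sketch - all class and field names are my own invention for illustration, not part of any existing system:

```python
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    name: str
    skills: list = field(default_factory=list)
    friends: list = field(default_factory=list)        # other UserProfiles
    project_roles: dict = field(default_factory=dict)  # project name -> role

@dataclass
class Discussion:
    # every thread is anchored to a concrete work item (a milestone,
    # a to-do entry, an idea), as described above
    linked_item: str
    posts: list = field(default_factory=list)

@dataclass
class Idea:
    title: str
    votes: int = 0  # ideas come complete with a voting system

@dataclass
class Project:
    description: str
    team: list = field(default_factory=list)
    goals: list = field(default_factory=list)
    ideas: list = field(default_factory=list)
    discussions: list = field(default_factory=list)
```

    Translating a discussion point into an action task would then simply mean creating a to-do item from a post and re-linking the thread to it.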

    Take a look at kluster - seriously, that's probably the best crowdsourcing platform I've seen by far. I love the cute jargon, the interface, and the investment models and algorithms.

    You should be able to choose whether to make your project public, invite-only, or private. Users can have different roles, with different rights. Although to really harness the power of crowdsourcing, the project should be public, and accept input from everyone.

    • jobs and services directory - companies can pay to advertise and gain exposure to the projects; members can advertise their skills as well.
    • specialised application modules, or APIs to allow them to be built, e.g.
      • software bug tracking systems
      • donation tracking systems
      • review systems
      • voting systems
      • specialised feedback systems 

    The platform should be as general as possible, so that it can be used for software projects, fundraising projects, or even projects like opening a new restaurant. Although it's probably worth concentrating on particular uses at the beginning.

    Yes, it is a project with a very wide scope, and it's maybe trying to do too many things at once, but from a user perspective, it's a big step up from what we have right now - generally disparate systems only connected by us spending time and effort.

    Like the idea I posted a few posts back, this one's free to a good home as well - I'd definitely be interested in something like this though.

  • Why can't hardware makers do software?

    I just had to fix a problem on a computer with Windows Live Messenger unexpectedly closing due to a DEP violation, and guess what the cause was - Acer's eDataSecurity Management tool was doing something screwy with it. I gotta give them credit for providing a fix to the problem (problem and solution here), but why do problems like these happen time and time again? And even when the software works, it nearly always seems to be some clunky prototype someone was paid peanuts to do.

    The amount of crap desktop/laptop manufacturers preload on their systems annoys any IT guy to no end. The first thing I do for the machines we set up at work is remove all that crap, or do a complete OS restore without those bits. I'd be interested to see how many people actually use the crap that's generally on them. Do the manufacturers really think they're adding value? Who the hell buys desktops/laptops based on the additional crap preloaded on them?

    That crap is generally unsupported too, written by the new guy in the office with the shit tasks list, and as long as it worked the day he wrote it, it's assumed to be rock-solid and will work anywhere for eternity. Great.

    If they want to add value by preloading software, do it well, do it unobtrusively, give us the option of whether or not to use it, and respect the fact that I don't want it scattered everywhere on my system (and follow good Windows programming guidelines too, i.e. don't write stuff to the Program Files folder!).

    Motherboard manufacturers are repeat offenders too. Here's a screen dump of my MSI mobo's temperature monitoring app:


    Why do they always look so ridiculous? This isn't the first motherboard I've seen with a monitoring app like this - Gigabyte, Asus and Soltek all have apps that look like this or worse. Is there some kind of competition between mobo manufacturers to make the most ridiculous-looking monitoring app ever?

    Looks aside, I thank my lucky stars every time this thing loads up and doesn't make my computer spontaneously restart after a few minutes. You have no idea how nervous I get when I have to use their utility to do a BIOS update...

    Many peripheral makers aren't much better either. Here's one from my Dvico FusionHDTV card.

    Don't get me started on how unintuitive it is (reservation? automatic date maker? where's the bloody close button?) - I'm just thankful it works. It took them years to get the software to this working state - the software I got with it originally was unworkable to say the least. In the meantime, I got all sorts of weird 'UI enhancements' in the following versions, but the software still did all sorts of weird things until they picked up their game a bit recently. Yet I still have no confidence in pressing the 'record TV' button, and dread the day I have to reinstall it on another computer.

    A friend's Linksys CIT200 Skype phone is the same - when that app is loading, everything in XP freezes until it's done.

    The pain isn't on and off though. It's full on. There seems to be some implicit rule that they all have to have as much crap as they can get away with running on your computer at startup. That MSI program above automatically loads on startup after install. Others, like Logitech, add innocent things like an update checker (I've lost count of the number of update checkers I've had to disable - I wish Windows had the Linux package system sometimes, even with its annoying quirks) to the startup list. Others add more annoying things like services that are useless unless the desktop app is running, while the more evil ones add apps that sit in the task bar quietly chewing up CPU and RAM until your computer slows to a crawl.

    Toshiba's Bluetooth stack is a serious offender here - unlike the Microsoft stack, it requires an app for each BT protocol to be loaded on startup, each of them taking up valuable resources that could be shared, even if my BT is switched off. In fact, while I haven't looked at other laptops lately, Toshiba is a serial offender when it comes to startup apps - my laptop had over 10 apps attributable to Toshiba loading on startup when I first got it, some useful, others not so much. Regardless, there must be a better way to manage all these functions without all this useless overhead.

    Why is the software component nearly always an afterthought when it comes to the overall product? Is it because it's not physical, and therefore doesn't matter? Or are the hardware guys somehow considered superior, with the software guys just minimum-paid dropkicks hired to build software because the company had to? In most cases, the software is as important as the hardware, and it doesn't matter how awesome your hardware is if your software sucks - the overall user experience is still bad (and unfortunately Windows cops most of the blame due to the nature of the bugs, so the hardware manufacturers have no incentive to fix it).

    Whatever it is, if I'm ever in the hardware device industry, I pledge to give software the attention it deserves :)

  • Idea: online personal trainer marketplace

    A while ago, a mate and I were brainstorming ideas to link his work out/fitness/health/sport fetish in with the web, with the view to form something financially viable out of it. We saw lots of fitness community websites out there like traineo, gimme20, peertrainer and sparkpeople, but they're all focused on bringing ordinary people together. Problem is, there is no real authority, like a personal trainer, available on these websites to provide personal attention and reinforcement that what they're doing is right for them and their goals.

    The personal training, or health and fitness, market in Australia is booming at the moment as the media tell us 24/7 that we're in the middle of an obesity crisis (which, as a uni student, seems like complete crap if uni students are any representation of the wider community). So in response, personal training is being spruiked as some kind of wonder job, where you can be in complete control of when you want to do what and where, yet still get 'great pay' (the only ads worse than those radio ads are the sex help ones). Heaps of people are doing these courses, and if my bets are anywhere close, not all the graduates are experiencing the good life promised.

    This led us to the idea of an online personal trainer marketplace. After all, personal trainers who choose to go it alone can place an ad in the paper, or on ebay, or some other classifieds website/newspaper, but honestly, how many people look there, and if they do, will they have confidence in the transaction?

    Kind of like onforce.com for IT technicians, the personal trainer marketplace, codenamed reboundNOW (domain not available anymore), hooks potential clients up with trainers. But it doesn't stop there - it provides a full set of tools to help clients carry out their routines and keep track of them, and for trainers to keep an eye on their clients to make sure they're doing the right thing. And of course, it'll have some nicely integrated community and social networking magic sprinkled on top to pull people together.

    So as a trainer, you'd add a profile on reboundNOW, providing information on yourself, your qualifications, specialities etc. You may even choose to start a blog or podcast with tips or journal entries to help demonstrate your skills. Your qualifications will be checked by reboundNOW, and marked as such when they pass. Trainers will be charged to list their profile for x months, and then a small fee for each client they enlist. They are free to set their own fees as they wish on top of that.

    As a potential client, you can then look through all the profiles, filtered by location, skills, sex, specialities etc. - kind of like a dating site, but with more class :) You would have the ability to talk to them before committing, or maybe even talk to their existing clients. Once you've decided, you can then work out pricing and other details, confirm the relationship and organise a fitness survey.

    You will then have access to the extensive exercise library built into reboundNOW, complete with video demonstrations, as well as a useful exercise tracker that lets you keep track of your routine, your performance and changes over time. You can print out the workout to take to the gym, then report the results back into the system, along with any comments.

    Your new trainer will have access also, allowing them to update your routines to match your performance and needs, as well as communicating with you to see how you think you're going. They can also prescribe exercises not in the library by adding their own.

    If you choose, you can also make your exercise tracker public, or available to certain friends so they can keep track of you and motivate you when you need that extra boost.

    There will also be a shared calendar between you and your trainer, allowing you to block out times that you're busy, so you can work out when you can meet in person for a session.

    Other functionality may include:

    • groups - can be groups between people with similar interests across reboundNOW, or groups for group sessions with the trainer.
    • events - allows people or groups to easily organise and hold events, complete with mapping and suggested routes.
    • forums - again, can be sitewide, or trainer specific; allows users to talk, share stories, tips, ideas and provide motivation.
    • payments - makes it easy for clients to pay for their trainer's services and/or products, allowing trainers to accept payments from credit cards without further hassle.
    • business analysis - for trainers to see how their numbers stack up easily, and helps with any business requirements they may have.
    • store - a one-stop shop for fitness-related products.
    • diet - a recommendation and tracking module for the client's diet.
    • goal setter - allow trainers and clients to set goals and track achievements together.
    • measurement tracker - track measurements like weight and fat index over time, and visualize improvements.
    • body image tracker - post images and see changes to your body over time.
    • competitions - challenge clients (sitewide, or trainer specific) to a particular goal, and be rewarded for achieving it.

    There are heaps of other things that are possible, but the goal is to keep clients and trainers using the website even after they have met. They won't be prevented from sidestepping the website, but hopefully the features will provide enough benefits to both trainers and clients such that the minor extra cost will be worth it.

    The monetizing strategy is fairly different to most other fitness community websites, which generally focus on either a direct cost to the client for the website features, or advertising. reboundNOW instead offers something that can't be replaced by another website - a trainer. And as the site offers benefits to both the trainer and the client, the relationship is stickier.

    The closest website we found that did this (mind you, this brainstorming was done a while ago), was iTrainHarder.com. However, iTrainHarder does not have the concept of a marketplace - trainers can bring their clients on to the platform and get similar features, but it doesn't seem as community oriented, and rather than try to encompass the entire relationship to make it a one-stop-shop, it only focuses on certain aspects.

    I think there's potential in this, if it's executed well, in consultation with people in the industry. My mate and I never really got round to getting this off the ground, and we're now both engaged in other endeavours, so the idea, which I'm sure isn't exactly new, is free to a good home :)

  • Recovering Toshiba M200 XP to a partition

    Being the prepared person that I am, I have been thinking about uni this year, and getting all the things I need done before it starts out of the way now.

    For the last year and a bit, I've been using my Toshiba Portégé M200 tablet as my main note-taking device, and it's been working great (see this post). During my break, I upgraded to Vista, and am generally very happy with it (then again, Windows has very rarely caused me issues).

    The two major problems I have with it, though, are performance and battery life. Performance isn't that bad - it's definitely slower than XP, but I find the machine more responsive (i.e. it doesn't completely freeze up while some program is doing something). Battery life has taken a hit though, cutting an hour or so off what I usually get. This is a problem if I'm gonna use it at uni, so I've decided to install XP and Office in a tiny 6GB partition.

    Sounds easy, but actually isn't, especially given the crappy Recovery CDs provided by Toshiba (if indeed, you got any).

    I'm not going to cover all the steps involved here, only the bits that I haven't found documented very well. This is a fairly geeky process, so only proceed if you know what you're doing. No guarantees either; in fact, expect your machine to be bricked. Do read the entire post before you start, because there are some pre-requisites that you need.

    Quick checklist of what's needed:

    • Norton Ghost (particularly, Ghost Explorer; comes bundled with some editions of Norton SystemWorks)
    • Toshiba USB floppy drive (other branded ones may work)
    • USB CDROM drive
    • a computer with a CD burner
    • floppy disks - at least 6
    • internet access, preferably throughout so you can hit google when you're stuck, and
    • patience; hopefully not too much.

    Backup first.

    I used Norton Ghost and ghosted the entire hard disk to another computer - plenty of guides out there, and fairly straightforward anyway. Alternatively, you can use Vista's in-built Complete PC Backup tool (if you have Vista Business, Ultimate or Enterprise) - instructions at http://www.bleepingcomputer.com/tutorials/tutorial145.html.

    You will need a USB floppy disk drive for Ghost (unless you use the SD card image trick - save the floppy as an image, name the image $TOSFD00.VFD, copy it to the SD card, select the SD card as the boot device on the M200 and it'll work).

    Be warned that chances are, Vista will not boot when you restore the Ghost image due to the way the boot loader works, so you'll need some way to fix that. And if you used Vista's in-built Complete PC Backup tool, you'll need this too, to access the restore functionality. The best way is to load the Vista Recovery Environment from the Vista install CD, but you'll need the Toshiba proprietary CD drive for that (as crappy Toshiba didn't allow booting from standard USB CD/DVD drives). This makes recovery very difficult, as Vista basically expects a bootable CD/DVD drive (and why not - my old Celerons could do this). One trick is to download the Vista Recovery Environment ISO file here, create another partition, mark it as active (boot partition), make that partition grub4dos bootable, then place the ISO file in that partition and boot into the ISO using grub4dos. The grub commands are:

    • map --mem (hd0,3)/Vista_Recovery_Disc_x86.iso (hd32)
    • map --hook
    • chainloader (hd32)
    • boot

    Amend accordingly if your partition numbers are different, or the ISO file name differs. Once in, click Repair your computer, accept the repairs, restart and you should be on your way. This is a pretty nasty workaround, so fingers-crossed you won't need it.
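
    If grub4dos is set up with its usual menu.lst, the same commands can be saved as a menu entry so you don't have to type them at the grub prompt each time. This is only a sketch - the partition number (hd0,3) and the ISO file name are carried over from the example above, so amend them to match your setup:

```
title Vista Recovery Environment (ISO)
map --mem (hd0,3)/Vista_Recovery_Disc_x86.iso (hd32)
map --hook
chainloader (hd32)
boot
```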

    Partition the hard drive

    There are a number of ways to do this, but the best way I think is to use Vista's in-built disk management tool (other tools tend to bugger up Vista and require a Vista repair to get it working again). Make sure you have enough space first.

    See the Shrinking Windows Vista Partitions or Volumes section at http://www.bleepingcomputer.com/tutorials/tutorial133.html. When you confirm the shrinking action, it isn't terribly informative as to what it's doing - the only indication is the busy spinny circle in the disk management window. Be patient, and when it stops spinning, it's done.
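
    If you prefer the command line, the same shrink can also be done with Vista's diskpart utility. A sketch only - the volume number below is an assumption, so check the output of list volume before selecting, and adjust the amount (in MB) to taste:

```
rem save as shrink.txt and run: diskpart /s shrink.txt
list volume
select volume 1
shrink desired=6144
```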

    If, like me, you found that the amount you can shrink by isn't enough, see the instructions here - http://www.howtogeek.com/howto/windows-vista/working-around-windows-vistas-shrink-volume-inadequacy-problems/. For me, the most effective solution was to download, install and use PerfectDisk to defragment my hard drive using the Offline method (note: if a reboot is required using this method, the screen may stay blank when booting up - this is normal and is PerfectDisk working its magic; be patient). Then try the shrinking process again.

    When considering how much free space you'll need to allocate for XP, keep this in mind - a standard Windows XP Tablet PC Edition install takes about 2.5GB, and the page file and hibernation file (if you choose to enable either) will each take up space equal to the amount of RAM you have installed. Office 2007 takes around 900MB (due to the installation files being stored locally), and you should leave at least 1GB of free space for Windows to work with. I chose 6GB, with both the page file and hibernation disabled.

    Try to allocate enough space now, as it is very difficult to reallocate later. If you try using tools like GParted (as of parted 1.8.1), you may find that your partitions don't boot anymore, because it doesn't always copy NTFS partitions properly, resulting in errors, and it doesn't copy Windows XP's boot loader properly either, resulting in an invalid boot.ini error. These can be fixed using XP's Recovery Console though, which can be booted from the boot floppies you're gonna create.

    Once you've made enough unallocated space, right-click that space and select New Volume, and create a new primary partition taking up all the space available. Use NTFS as the file system. Once that's done, I suggest you label the partition by right-clicking it, selecting Properties and typing something into the text box at the top. This way, you can easily identify the XP partition.
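
    If the disk management GUI misbehaves, the create-format-label steps can also be done from diskpart. Again only a sketch - disk 0 and the "XP" label are assumptions, so confirm the disk number with list disk first:

```
rem run inside diskpart, after confirming the disk number with: list disk
select disk 0
create partition primary
format fs=ntfs label="XP" quick
assign
```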

    Creating your XP CDs

    Here comes the trick. You may be thinking - can't I just use the recovery process on my recovery CDs? The answer is no, because that process assumes you want to wipe your entire HD, which you don't. So what we have to do is recreate some XP CDs - which we should've been given instead of stupid recovery CDs grrr.

    Credit for the following instructions goes to http://www.gottabemobile.com/forum/printer_friendly_posts.asp?TID=339 (part 2). Below is an adaptation of those instructions for the M200.

    If you're lucky, you would've gotten a set of 4 recovery CDs in the box with your M200. If not, stop here, find yourself a set (or torrent for them) before continuing. If you have a complete drive backup of your Windows XP installation, that'll suffice.

    Perform the following instructions on a computer with a CD burner.

    1. Create a folder on your desktop called M200BaseImage.
    2. Insert recovery CD 1, navigate to the BASE folder inside, and copy all the .GHO and .GHS files into the folder you just created.
    3. Repeat for the other 3 recovery CDs (28 files in total).
    4. Create two folders on your desktop, named VRMPOEM_EN and VRMPOEM_EN2.
    5. Open up Ghost Explorer, go to File -> Open, navigate to the M200BaseImage folder, and double-click on PREINST.GHO.
    6. It'll now ask you for the password. If you have the recovery CD set with the P/N TR04C11E*CD, where the * is the CD number, the password is 2566 (thanks to the 'crack' procedure at http://weller.ws/toshiba/norton.html). Otherwise try the other codes available on that page, then the hacking procedure as a last resort.
    7. Click Yes to the Warning message about spanning.
    8. Once it has finished loading, copy the folders SUPPORT and VALUEADD to the VRMPOEM_EN folder by dragging them (or copy & paste).
    9. Copy the folder CMPNENTS from the image to the VRMPOEM_EN2 folder.
    10. Now navigate inside the WINDOWS folder in the image, and copy the folder I386 to the VRMPOEM_EN folder.
    11. Still inside Ghost Explorer, go into the SYSTEM32 folder, find the file wpa.dbl and copy that to the VRMPOEM_EN folder also. You can close Ghost Explorer now.
    12. Now from a Windows XP SP2 Install CD (upgrade or full version, doesn't matter), copy everything except the SUPPORT, VALUEADD and I386 folders, and wpa.dbl. Leave the XP CD in the drive.
    13. To make the process easier, you can choose to create a folder called DRIVERS inside VRMPOEM_EN2 and place the XP drivers for the M200 there, in particular the network card drivers, so you at least have internet access to download the others. Otherwise, you can use a USB thumb drive instead. Drivers are available from http://www.isd.toshiba.com.au.
    14. You now need to extract the boot image from the XP CD. This can be done using a utility like ISOBuster, or Bart's Boot Image Extractor. The file is generally called Microsoft Corporation.img. Place this file on your desktop. The XP CD can be ejected after. If using ISOBuster, see step 4 in these instructions for help - http://www.winsupersite.com/showcase/windowsxp_sp2_slipstream.asp.
    15. Just as a final check - the VRMPOEM_EN folder should contain: DOCS, DOTNETFX, I386, SUPPORT, VALUEADD, AUTORUN.INF, README.HTM, SETUP.EXE, SETUPXP.HTM, WIN51, WIN51IP, WIN51IP.SP2 and wpa.dbl. The VRMPOEM_EN2 folder should contain CMPNENTS, plus DRIVERS if you chose to put the drivers there.
    16. The CDs are now ready for burning. Insert a blank CD, and using your favourite CD burning software, burn a CD containing everything within the VRMPOEM_EN folder, with a volume label of VRMPOEM_EN and a boot image set to the file extracted in step 14. If you're using Nero or Easy CD Creator, step 5 at http://www.winsupersite.com/showcase/windowsxp_sp2_slipstream.asp may help.
    17. Burn another CD containing everything within the VRMPOEM_EN2 folder, with a volume label of VRMPOEM_EN (note the number 2 is not wanted). This CD doesn't need to be bootable, so don't worry about the boot image.

    You have just recreated the generic Windows XP Tablet PC Edition 2005 CDs.

    Creating boot floppies 

    Problem is, you can't actually boot off these CDs (unless you have the proprietary CD drive), so you'll need to create some boot floppies (yes, you'll need a floppy drive - alternatively, try the network PXE method, which is much more complicated). There are 6 in total, and they're available at http://www.microsoft.com/downloads/details.aspx?FamilyId=535D248D-5E10-49B5-B80C-0A0205368124&displaylang=en.

    Make the XP partition bootable

    I deliberately separated this step from the partition step in case people don't read the instructions through first. If you've managed to complete all the above steps successfully, boot back into Vista on your M200, and return to the Disk Management console. Right-click on the XP partition and mark it as active (right-click on partition, click Mark Partition as Active). Note, this means you won't be able to boot into Vista anymore, unless you reverse the active flag back using a boot disk. This step is necessary for the XP installation process.

    All set, let's install XP

    Alright, now connect your USB CD drive and floppy drive to your M200, and boot it up using the XP boot disks (you might have to go into the BIOS and change the boot order, or mash the ESC key on startup to force it to boot from the floppy).

    Once in, it's a standard XP installation process. The product key it asks for is located on the bottom of your M200. At some point, it will ask for CD 2 - insert it. Then it will ask for CD 2 again; this time, re-insert CD 1.

    If however, you can't boot using your XP boot disks, and get errors like UNMOUNTABLE_BOOT_VOLUME, first check that your boot floppies are not defective by running a disk check (happened to me). Then check that the hard drive inside the M200 doesn't have disk errors by running another disk check. If there's still an error, try formatting the Windows XP partition again - you'll need to use a boot disk for this now (http://bootdisk.com/).

    When XP is all done, install drivers, setup desktop etc., then right-click on My Computer, select Manage, click on Disk Management, right-click the Vista partition and select Mark Partition as Active. Reboot.

    Setting up dual-booting

    On reboot, Vista should boot up again. You shouldn't have to switch active flags each time you want to boot into XP though - and you don't.

    Once Vista has booted up, download and install VistaBootPro.

    Run VistaBootPro. It should tell you you only have 1 OS installed, Vista. Click on Manage OS Entries, check the Add New OS Entry checkbox near the bottom, type in a name for your XP installation (only for your identification purposes), select Windows Legacy in the OS Type box, and in OS Drive, select the drive letter for XP as seen from Vista (the letter is actually irrelevant; it gets converted to the disk offset value the partition starts at - the drive letter is only so you can identify a partition). Click Apply Updates. Now on reboot, you should be able to choose between XP and Vista.

    If luck was on your side, it should work like clockwork. If not, well, all I can say is, I'm glad I'm not you, seeing as I spent the last few days trying to get this working, the hardest bit being the lack of a bootable CD drive - stupid, stupid, greedy Toshiba.

  • The reality of clean-feed internet filtering

    This isn't the way I'd like to start the new year here, but it's creating a storm and it's something that has the potential to affect everyone as internet users.

    For those who have been out partying it up over the new year and haven't yet caught up with Australian news,

    Senator Conroy [Federal Telecommunications Minister] says it will be mandatory for all internet service providers to provide clean feeds, or ISP filtering, to houses and schools that are free of pornography and inappropriate material.

    "If people equate freedom of speech with watching child pornography, then the Rudd-Labor Government is going to disagree."

    Senator Conroy says anyone wanting uncensored access to the internet will have to opt out of the service.


    Australian bloggers are up in arms, some completely disagreeing with the proposal and thinking we're on the slippery slope of internet censorship, while others are commending the government for taking such a strong stance against undesirable content on the internet.

    Let's take a moment and step away from the censorship argument. I'm not going to tackle that because it has been done to death, and there is simply not enough information to seriously discuss that scenario without speculation.

    Instead, I'm going to run through the reality of a clean-feed solution.

    In the real world, we have classification in Australia for TV shows, movies, magazines, games, music and other types of media, provided by the Classification Board. Depending on the type of media and its usage, classification may be required, performed by the Classification Board, with disputes brought up with the Classification Review Board. Anything that needs to be classified and is not, or has been refused classification, cannot be sold.

    Here are some statistics on what they classified, 2006-2007:

    • 214 publications
    • 402 films for public exhibition
    • 4,555 videos or DVDs for sale or hire
    • 890 computer games
    • 28 Australian Communications and Media Authority Internet referrals, and
    • 134 enforcement referrals


    The internet however, is generally unclassified. So it would make sense that we classify it, so we can block illegal content and prevent questionable material from getting into the hands of those who shouldn't be viewing it, right? After all, no one can argue that people should be able to access child porn, or that 10-year-old kids should be able to watch hardcore porn videos.

    Thing is, the internet is unlike any communication medium we have had before. Anyone can contribute content, wherever, whoever, whenever they may be, and that content is available to anyone in the world. There are no hard statistics on the amount of content on the internet due to the distributed and dynamic nature of it, but Netcraft, an internet monitoring company, recently counted 155,230,051 active websites (not pages) as of December 2007. I'd say that's a fairly conservative figure, and growing significantly by the second.

    Keeping that in mind, let's look at classification and filtering. There are 4 ways of doing it.



    Whitelisting

    This is when the filter has a list of good sites, and anything that is not on that list is blocked.

    This is how classification in the offline world works right now - they assess material that requires classification, and if it is refused classification, it is banned. It works because there are generally only about 5000 items that need to be classified each year, and once material is classified, that material never changes.

    Internet 'material' however is completely different.

    To start with, there's the sheer enormity. 17 Classification Board members currently classify 5000 items a year. If we assume there are 155,230,051 public websites out there, and all need classification because they can be accessed by anyone, then using the current classification workload figures, we would need 527,782 board members to classify them all. Imagine trying to keep 527,782 people consistent, and the bureaucracy that gets created. Also, any new site that pops up will be blocked until the classifiers get around to it - say goodbye to access to the latest web apps, information and content.
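That 527,782 figure is just the current workload scaled up to Netcraft's site count - the arithmetic:

```python
# Scaling the Classification Board's current workload to the web,
# using the figures quoted above (17 members, ~5000 items/year,
# Netcraft's December 2007 count of active websites).
members = 17
items_per_year = 5000
active_sites = 155_230_051

items_per_member = items_per_year / members     # ~294 items each per year
members_needed = active_sites / items_per_member
print(f"Board members needed: {members_needed:,.0f}")  # ~527,782
```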

    Then there's the dynamic nature of internet content. A web page does not stay constant once it has been published, unlike a book or a movie. It can be edited (or hacked) at any time to anything. Or it can dynamically draw content in from other websites. Therefore classification is effectively useless because by the time it has been classified, the content has changed so much it probably needs reclassification. The advent of enormous amounts of user generated content makes this even more ridiculous - try classifying every photo on flickr, a popular photo sharing service that has some risque shots in there, all the while thousands of new photos are being uploaded every hour.

    Of course, we could just permit a small selection of websites, but that amounts to censorship - why does website X get permitted, but website Y doesn't? It still doesn't tackle the dynamic nature of the internet, and it removes one of the best things about the internet - you can find information about basically anything on it. Have a look at these stats from the UK's clean-feed service - http://www.cleanfeed.co.uk/catstats.php. They have only classified 9 million of the 155 million active websites out there.



    Blacklisting

    This is the opposite of whitelisting - if the site you're trying to access is on the list, then you are prohibited from accessing it; sites not on the list are allowed.

    This approach suffers from similar issues to whitelisting. In order for the blacklist to be effective, every site would need to be reviewed, or at least a large proportion of them, because otherwise you could easily go to another site for the content that was blocked. It would also need to be constantly updated, because of the dynamic nature of the internet.

    It is also near useless against internet proxy servers. Internet proxy servers are similar to a proxy in real life - they act on behalf of someone else. There are thousands of proxy servers available on the internet for such purposes, with many changing on a day-to-day basis. Blacklisting is ineffective against proxy servers because using a proxy hides the real request.

    For example, let's say www.porn.com was blacklisted. If I try to access www.porn.com I would be blocked. If I used a proxy however, I would instead be accessing www.proxyserver.com and telling it that I wanted to access www.porn.com. The filter would think I was accessing www.proxyserver.com and let me through.
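In code, the filter's blind spot looks something like this (the hostnames are the made-up ones from the example above):

```python
# Toy model of a URL blacklist that only inspects the host being
# connected to - a simplification for illustration.
blacklist = {"www.porn.com"}

def filter_allows(connect_host):
    # The filter sees only the outgoing connection's host.
    return connect_host not in blacklist

# Direct access: the real host is visible, so it gets blocked.
print(filter_allows("www.porn.com"))         # False

# Via a proxy: the real target travels inside the request itself
# (e.g. "GET http://www.porn.com/ HTTP/1.1"), which this filter
# never sees - only the proxy's hostname is checked.
print(filter_allows("www.proxyserver.com"))  # True
```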

    Sure, you can blacklist proxy servers, but with so many popping up, and with distributed proxy networks like Tor where anyone can be a proxy for anyone else, it's a futile task. And some 'proxy servers' are actually useful - for example, you can turn Google Translate into a proxy server by telling it you want to translate a particular web page from Spanish to English. It looks for Spanish words in the page, but finds none because the page is in English, and returns the page to you, all the while the filter thinks you're just accessing Google Translate.


    Content filtering

    Instead of having humans classify every website, this approach allows classifiers to specify particular bits of content to watch for, and if they exist, or enough triggers exist, the website is blocked. The content may be certain words or phrases, or certain types of images (e.g. ones with a significant amount of area in skin-tone colours). So instead of humans doing the filtering, computers are.

    The problem with this approach is that computers are dumb. They see the world in black and white, when it is in fact grey. They do not understand what they're filtering, and hence they miss any contextual significance. For example, is an image with significant amounts of skin tone a pornographic image, or an image of a medical condition? Or is it a web page with instructions on how to build bombs, or a web page on the chemical reactions of particular substances? 
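A naive keyword filter makes the context problem concrete. The trigger list, threshold and pages below are all invented for illustration, but both pages trip the same rules:

```python
# A deliberately naive content filter: block any page whose text
# contains at least THRESHOLD trigger words, with no notion of
# context. Trigger list and example pages are made up.
TRIGGERS = {"breast", "naked", "explicit"}
THRESHOLD = 2

def blocked(page_text):
    return len(set(page_text.lower().split()) & TRIGGERS) >= THRESHOLD

spam_page = "explicit naked pics click here now"
medical_page = "breast examination guide: what a naked eye exam can detect"

print(blocked(spam_page))     # True - correctly blocked
print(blocked(medical_page))  # True - false positive: a medical page is blocked
```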

    However, this is the only approach that is at all feasible when it comes to filtering the internet, because it is the only one that can handle the enormity of the internet. There is a lot of research going on to teach computers to understand human languages, and to develop algorithms that are smart enough to deduce the meaning of documents, images, videos and other media. But it's a long road, and there's still a lot to be done before it is useful. And remember, we don't stand still either - our language and way of expressing ourselves is constantly changing.

    It's worth noting that encrypted websites are immune to this type of filtering.

    A combination of the above

    Most internet filtering systems use a combination of the above approaches. But as you have just seen, none of them are in any way effective on their own, so combining 3 systems that don't work doesn't result in one that does.

    So, now what?

    You may be tempted to say, so what? At least we're doing something to stop our kids from accessing porn and violence, or the propagation of child pornography.

    Stop kidding yourself. The enormity and dynamic nature of the internet means classifying and filtering it effectively isn't just hard. It's impossible. If you think a filter will be effective in protecting your kids from porn, or stopping people from accessing illegal material, think again - proxies, secure tunnels, google, hacks. And even if you don't think your kids or next door neighbours are smart enough to work it out, they'll know people who do.

    Think it through thoroughly before forming your opinion. Go to google and do a search on the most random thing you can think of, then do one on the most common thing you can think of, and consider the effort needed to classify those, and everything in between. Then get your kids to show you facebook, or sign up yourself, and see all the avenues through which content can be contributed and changed.

    Internet filtering is a blunt instrument. Consistently banging that hammer on the internet will hurt those who legitimately use the internet much more than those who use it for objectionable purposes. This is amplified due to the opt-out nature of the proposed internet filtering - we all know few would be bothered opting out due to laziness, and/or the implication that because they're opting out, they want to access child porn.

    Think of the implications of someone who earns a living from their online store, yet it is suddenly blocked a few weeks before Christmas because of a questionable comment a user made on a product. That person could potentially miss out on the entire Christmas period, depending on how fast the classifiers re-review their site. Their name may also be forever tarnished, as users find out the site has been blocked, implying it contains pornographic material. On the other hand, if you block one child porn site, they'll just respawn under a different address and away we go again.

    Filters may form part of the solution, but they aren't a silver bullet. There is no way to control content online the way we do offline. The fact that anyone can post content on to the internet is both the internet's strength and its weakness. It revolutionised the way we communicate, with the ability to broadcast our thoughts, views, information and content to anyone in the world who cares to look at it. And it will become more and more important as we explore and adopt ways of using it to make our lives easier. Yet on the flipside, we are finding objectionable content is more freely available, as those propagating such content exploit the internet as well.

    The internet is a new medium, unlike any other. We need to treat it as such, stop applying ideas that worked in other mediums, and think of new, radical, innovative ideas that cater for the internet's unique abilities. We need to educate the public, and tackle the ignorance people have about the internet, the ignorance that political groups and companies with vested interests are exploiting so well, the ignorance that will bite them when they realise they have been tricked.

    If you're on facebook, join this group and let's work out a solution that actually works.

  • Another customer service screwup

    Just got off the phone with Lenovo's post-sales line to cancel a new laptop order, after a ridiculously long wait for delivery.

    What happened? Read on...

    Back in October, a mate of mine was looking at getting a new laptop, and I managed to convince him on a Lenovo X61. It wasn't the utilitarian design or the annoying nipple mouse (which he didn't mind) that sold him, but rather the cool videos over at LenovoVision, demonstrating the durability of the machines.

    (You gotta admit, those videos are pretty cool. The guys at work love them, and while I'm not completely sold on them yet, the durability and the awesome battery life  - 6-8 hours realistically under light conditions - gets me pretty close to getting their X61 tablet.)

    Best of all, Lenovo were running their anniversary sale promotion, with significant discounts on all their machines. So on October 30th, we ordered one.

    The expected delivery time as stated in the confirmation email was 1-3 weeks (1 week unlikely), which is too long for a simple pack-and-ship job if you ask me, but fine, that's what it took. This meant my mate was gonna be overseas by the time it arrived, so we arranged for it to be shipped to me, and I'd bring it up to him when we catch up.

    So November comes and goes, but no courier rings my doorbell at 7am (that's generally when packages are delivered for me for some whacko reason).

    Checking on the online order status page, it now says it's 'scheduled to ship' on the 3rd Dec, and should arrive on the 13th Dec.

    Dec 20th rolls along and still no laptop - couriers may be delayed, but generally not by a whole week - so I check the online order status again, which still says 'scheduled to ship' on the dates as above. Useless site and bad expectation management - if you tell me a certain date, I expect it to be here on that date (Amazon does this better - they make sure the expected date is a week more than usual). Screwup #1.

    Something was wrong, so I called up their post-sales line. No voice recognition (thank god), only simple number-punching menus, the combination for which was given on the order status page, so it wasn't too painful. Once you get through and get put on hold, after a minute or so, it gives you the option to leave your name, number and a short message, and an operator will call you back within 4 business hours. I've used this system before at other places, and I like the option - you can get on with your life, and they'll call you back when they're ready, so everyone wins.

    The problem is this - they don't call you back. Don't offer the option if you're not gonna go through with it. Screwup #2.

    To make things worse, once they put you on hold, after a few minutes, it forces you to leave a message - you can't choose to remain on hold. I'm willing to stay on hold and listen to the same crappy song over and over again to the point that I'm actually humming along; just put me through to a damn person already!

    So I can't stay on hold in the call queue, and they don't call you back - just fuckin' great, it's impossible to get to someone, if there is someone there. Screwup #3.

    The only concession in all this is that the number is a 1800 number, toll-free from landlines. So I thought if you're gonna screw me over, I'll screw you over the only way I can - repeatedly call that number. And I did, leaving the phone on auto-redial.

    Then shock-horror, after an hour and a bit, I hear some random person's voice - I was through to one of the prized operators at Lenovo!

    The guy couldn't sound more pleased to talk to me - monotone, quiet, completely disinterested and probably half dead. In fact, he was kinda surprised when I spoke, completely missing every single word I said as he woke up. I guess I should be thankful I didn't get put through to some guy at an Indian call centre, who gets so excited when someone picks up they speak at 100 words a second in a mesmerising rhythm.

    Apparently the order was delayed until early January 2008 because of... have a guess, because you'll never get it.

    A Vodafone SIM card (the laptop has an in-built 3G/HSDPA modem).

    Yes, the wankers at Lenovo held the rest of the package at a warehouse somewhere, because they were missing a shitty piece of plastic that offers me nothing except some crappy introductory offer and more pre-installed crap. 

    Ok, so maybe if Lenovo didn't ship it, they'll probably miss out on a shitload of money from Vodafone, who pay them for the bundling. But is it worth pissing off all your customers who ordered an X61 direct (all X61s ordered direct were affected) just because of this? You couldn't just ship the package sans the SIM, and send out the SIM later? Or if you're really fuckin' cheap and don't want the extra shipping costs, at least send an email to your customers letting them know the situation! The online order status website is there for a reason too - use the damn thing! Screwup #4.

    To make matters worse, the guy was reluctant to do this even when I nudged him in that direction - apparently the 'system' doesn't allow him to split up a package. WTF - you're a computer company previously attached to probably the largest IT consulting company in the world, and your 'system' can't handle split packages? And the guy sounded like he couldn't care less when I cancelled the order (because I won't be around to pick up the package in January 2008) - just another day at the office. He didn't try at all to offer me other options, or some other incentive to not cancel. Screwup #5.

    Wow. I don't think you could screw up the ordering process any more for a potential customer unless you really tried. Of course, this small order is only a drop in the bucket for them, with their multi-million-dollar orders daily, so it means nothing, which really sucks for everyday consumers.

    On the flipside however, because my mate still wants the X61, he got a friend in the US to get it and bring it to him when they meet up, and guess what - it's around a grand AUD cheaper over there, even after you add all the bits to make their X61 models similar to the Australian X61 models (larger HD, Vista Business, fingerprint reader, wireless-N supported card) - for some odd reason, it seems the US X61 models are all lower specced in their default configuration.

    So yeh, Lenovo still gets his business, but it's disappointing how crap Lenovo Australia are. 

  • The state of mobile broadband in 2007

    With the view of going to RMIT next year (I've officially become a number there as of a few days ago), plus continuing work, I'm going to be spending a fair amount of time in transit. So I can fit everything in, and bypass RMIT's crappy internet rules (no non-RMIT email access allowed - WTF), I've been looking at getting a mobile broadband service.

    The two types of mobile broadband available here are either 3G/HSDPA based, or wireless based (e.g. WiMAX). Wireless-based ones are potentially faster, however they have a more limited service area. 3G-based ones, on the other hand, can connect to GSM networks when out of a 3G service area, providing GPRS-speed access and hence greater coverage.

    Some key questions:

    • monthly cost and data limit (excess charges or shaping)
    • contract length
    • modem cost, ownership and locking after contract length (applicable to 3G-based networks, as these modems can be used across 3G-based networks)
    • data limit includes uploads as well as downloads?
    • additional cost for roaming on to GPRS network?
    • coverage; however, as I'll mostly be using it in metro areas, I'm not going to delve into this much - all of the providers cover the metro area reasonably well.

    Here's a rundown of what I found:


    Telstra

    Telstra has both 3G based and wireless based services, being their Next G Mobile Broadband service and their BigPond Wireless Broadband service respectively. The main advantage Telstra has over its competitors is that their Next G network can reach peak speeds of 7.2 Mbps now, with a view to reaching 14.4 Mbps in the future (though others are planning this as well). Their coverage is also a fair bit greater when using their Next G network (important in outer suburbs and country areas).

    One look at the pricing however, and I knew both of these options were out.

    For mobile broadband, $59 gets you 200MB per month on their Next G network, plus more for a network-locked modem, with excess charged at 25c per MB (or $250 per GB). You can get cheaper plans with less data, or more expensive ones with more data, up to a 3GB plan for $119 a month.

    BigPond wireless broadband has slightly more favourable pricing but is slower and has less coverage, with $49.95 getting you 400MB per month, contracted to 12 months, with excess data charged at 15c per MB. Again, you can go up and down in cost for different speed and data allowance combinations.

    The staff at the local Telstra shop were fairly useless too - they sat me down, browsed to a webpage, and just pointed. That's it. Ask a question, and they'll point somewhere else on the webpage, even if it doesn't answer my question at all. I can use a web browser perfectly fine too, useless idiots.


    Optus

    Optus has a 3G based service only, which they call wireless broadband (their crappy website doesn't deal well with links - on their home page, click Personal, Internet, Wireless broadband). It has a max speed of 3 Mbps at the moment; generally it runs at speeds of 512 kbps to 1.5 Mbps.

    Optus kicked off the mobile broadband discount war last month, with an offer of 2GB of data a month (upload and download) at $39.99 per month for 24 months ($49.99 unbundled). Excess data is charged at 15c per MB (or $150 per GB). They also throw in a USB modem for no extra cost if you bundle it and join for 24 months (otherwise it's about $5 more a month). There's a cheaper offer too, but with a data limit of 400MB, it isn't worth it. Other modems are available too, including a wireless router modem and a laptop ExpressCard one.

    The catch here however, is that if you get one of their modems, it remains their property, seemingly even after the 24 month contract. I assume it is also network locked.

    Roaming from their 3G/HSDPA network to their GSM network does not cost extra, and data used on that network is counted towards your monthly limit. You can check your data usage on their website, which is nice.

    They have a 30-day coverage satisfaction guarantee too.

    UPDATE (22/12/2007): Optus has released a plan to counter the Vodafone attack - the 'yes' Everyday plan. It's $49.99 per month for 5 GB per month, excess charged at 15c per MB. Modem rental is still from $5 a month. The contract period can be 12 or 24 months.

    If you bundle it with a home phone/mobile, the above deal becomes $39.99 per month instead and the USB modem rental is free while in the contract period, with everything else the same. However, to get this price, you have to be contracted for 24 months, otherwise you're not eligible.

    These plans are available until 15 Jan 2008.

    It is also worth noting that the updated website now confirms you do not own the modem, even after your contractual period. The modem is rented from Optus, and you need to continue to pay that modem rental fee after you complete your contract. If it was free while in the contract (i.e. bundled for 24 months), then a $5 monthly rental fee applies once the contract finishes.


    Vodafone

    This is where it starts getting interesting. They have a 3G-based service called Vodafone Mobile Broadband, with speeds between 600 kbps and 1.5 Mbps, peaking at 3.6 Mbps, but as it is on their HSDPA network, it has the potential for faster speeds, pending network upgrades. Their coverage is traditionally less than Optus, Telstra and 3, but for metro areas it shouldn't be an issue.

    Until 31st December 2007, they're running an offer where for $39.99 a month, contracted to 24 months, you get 5GB per month (upload and download), plus a free modem/data card which you own at the end (unsure if it's network locked). Excess data is charged at 10c per MB (or $100 per GB), and you can track your usage in their app (although this is Windows only, and per computer - there is no central website to log into and check).

    When you're out of the 3G network, roaming on to their GPRS network is seamless and costs no more. They also do international roaming to a variety of countries, but at 1c per KB (i.e. $10 per MB), it's a bit expensive, especially given that free wireless hotspots are everywhere in many overseas countries.

    Whirlpool feedback is good so far.

    My impression of their retail stores was pretty good - they know what they're talking about (which surprised me), and were keen to show me the modem, and find out more info for me.

    Their general enquiries line is an entirely different matter. I called it up (they didn't seem to have a dedicated sales number), 1300 650 410, and was greeted by their computer phone chick, Lara (yes, it has a name). "Tell me in a few words what you're looking for..." I wasn't surprised at the voice recognition system - Telstra and Optus both have it too (and I thought the number systems were bad...). So I played along and said "mobile broadband".

    Lara decided I needed mobile technical support, and despite my protests, insisted I punch in a mobile number. I punched in a random one, and got placed in a queue. But figuring that talking to Lara was better than talking to some incomprehensible Indian call centre guy located on the street (judging from the background noise), I thought I'd give her another shot.

    So I hung up and redialed. This time I said "sales". Lara couldn't understand me. So I tried again. Lara thought I was an idiot and decided to coach me in communication. Getting quite annoyed, I tried using a range of expletives in hope it would transfer me to a person, but nope, that rumour was wrong.

    In the end, according to Lara, Vodafone has no sales team. This is pre-sales! This is when you're supposed to be courting me with everything, not screwing me over - you can only do that after you've suckered me into an abusive contractual relationship!


    3, as its name suggests, offers a 3G-based mobile broadband service, with download speeds up to 3.6 Mbps, but generally between 600 kbps and 1.5 Mbps. Upload speeds are around 384 kbps.

    To counter Vodafone's offer, 3 have halved the price of all their mobile broadband plans if you sign up for 24 months. The offer is valid until 15th Jan 2008.

    Just picking one of their plans for comparison (you can go higher or lower for different cost and data usage combinations), for $34.50 you get 3 GB per month (upload and download) for 24 months, with a free modem thrown in (unsure about ownership, but knowing 3, it's probably locked), and excess data charged at 10c per MB (or $100 per GB).

    The odd thing about 3's mobile broadband plans is that as soon as you drop out of 3's Broadband zone (aka their 3G/HSDPA coverage), you start being slugged an extra $1.65 per MB (or $1650 per GB, though you'd have to be pretty patient to download that much on GPRS). This is a pretty important point, one that they've conveniently hidden in the T & Cs. I guess the reason is that 3 don't actually own a GSM (2G) network, so when you drop out of their 3G network, you're paying roaming fees to roam on to another network (Telstra?). You can apparently disable roaming though.

    Whirlpool feedback is mixed, with most wary of the excess GPRS charges.

    I decided to give 3's sales line a try too, 131 683. Surprisingly, they're still on the automated number system, and not crappy voice recognition. That's a thumbs up in my book, but as soon as I got put in the queue, it was thumbs down. In most phone queues, you get a 15 sec sales burst, then classical or pop music. Not so with 3. You get at least 3 minutes' worth of brainwashing, courtesy of the crappy 3 signature tunes (3's a magic number..., and the shitty 3 times table song - I know my times tables damn it!). The worst thing is that the magic number song is actually catchy...


    These guys offer a wireless broadband service, based on their own network, therefore it's worth checking out the coverage first. Unfortunately, their plans aren't very competitive.

    For $49.95, you get a 512/128 kbps (download/upload) connection with a 1 GB data allowance per month. They do, however, shape your connection to 32 kbps (upload and download) once you reach that allowance, so there are no excess charges, just an effectively unusable connection. You can purchase additional usage at $14.95 per GB. There is no contractual period (until 25th Dec 2007). You do have to pay an extra 129 bucks for the modem (+6 bucks for delivery).

    Again, for more money you can get a faster connection with greater data allowance, and vice-versa.

    Their modem is fairly specialised, so it's unlikely to be usable on other networks - effectively, it's network locked.


    iBurst operates in a similar fashion to Unwired, with a wireless broadband service on their own network. Unlike Unwired though, you can only subscribe to their service via resellers, who all set different prices.

    For $49 at the Fat Free Fone Company, you get a 1024/384 kbps (download/upload) connection with a 1 GB data allowance (download and upload). As with Unwired, you get shaped after you reach the allowance, but this time down to 64 kbps, which is slightly more usable. Alternatively, you can choose to be charged 15c per MB for data over the allowance.

    There is a connection fee, as well as a cost for your modem, both of which depend on your contractual length - e.g. for 12 months, connection fee is 49 bucks, and the modem is 239 bucks.

    For more money you can get more data per month (or vice-versa), but the max speed is 1024/384 kbps.

    Again, the modem is fairly specialised, so again, effectively network locked.

    So which one?

    I'm leaning towards the Vodafone plan, because it offers the best value in my opinion. The 2 GB 3 plan for $24.50 a month is still up for consideration though, because I'm still not sure how much data I'll need, although the roaming fee is a bit of a turn off.

    The thing that bugs me most is the contract period - 2 years is a long time in the mobile world, especially with 3 and Vodafone pushing the pricing boundaries, and with Optus and Telstra (but unlikely) playing catchup.

    Signing up for one now locks me in for 24 months (unless I break the contract), until 2010 effectively - who knows what the pricing will be like then? Not to mention speeds, which are on the rise already, with clear plans to go faster and faster.

    The Vodafone offer is still rather tempting though...

    But regardless, hopefully this has been useful to others who are looking at mobile/wireless broadband right now. Keep an eye on the Whirlpool wireless ISPs forum for more info and feedback.

    UPDATE (22/12/2007): Optus has released a catch-up plan, see above. But when compared to the Vodafone deal, it really is not very competitive - the excess charges are 5c higher per MB, bundling is required to get the Vodafone price (actually $1 more per month), and the modem is rented. Apart from arguments about Optus' 3G and GPRS coverage, it really isn't worth it. Nice try Optus, better luck next time.

  • Recovering VMware snapshot after parent changed

    UPDATE (21/05/2010): I've been alerted to the ridiculous amount of comment spam this page has gotten; apologies to those who were further spammed by the email notifications. I have therefore disabled the email and commenting features, and all future comments will be moderated. Damn spammers have to ruin everything, grrrr.

    Scroll down to the problem or solution section below if you want to cut to the chase. 

    I upgraded my Kubuntu installation to Gutsy today - of course, it wasn't as smooth as it should've been. First I had to work out how to do it - the instructions were brief, screenshots confusing, and the process just didn't feel natural. The 'version upgrade' button only appears after you have satisfied certain conditions, conditions that you don't know. It just magically appears when it wants to, after pressing a special sequence of buttons.

    Then the 'distribution upgrade' process crashed and packages wouldn't install. It ended up working after a few tries.

    For some stupid reason, they still haven't fixed the 'failed to set xfermode' bug that heaps of people have encountered, which really cripples the system - it doesn't boot at all. In fact, the upgrade removes the fix for it too - adding irqpoll to the end of the kernel line for the appropriate entry in /boot/grub/menu.lst.

    Plus they introduced a new bug by adding tablet settings into /etc/X11/xorg.conf by default, even if no tablet exists, tripping up the system. And did I mention that the network connection is flaky and standby/hibernate still doesn't work? Linux is still Linux it seems.

    Anyway, it all worked out in the end after some googling, so I went to install VMware Server on it so I could run my virtual machines there as well as in Windows. There is no package install available for it, so follow the instructions here; however, use this patch instead.

    Once all that was working, I ran the VMware Console, about to run my Windows Server 2003 Standard Edition virtual machine, when I thought, hmm..., I don't want this VMware instance fudging with the Windows VMware instance, so I'll create a new virtual machine, and link it to the existing virtual hard disk.


    All sounded cool, until I accidentally linked to the base parent hard disk, and not the latest snapshot. So once I booted it, not only did I not have the latest changes, but when I re-linked it to the latest snapshot, it wouldn't boot anymore. Instead I got the error message, "Cannot open the disk ... Reason: The parent virtual disk has been modified since the child was created."

    Did I mention that the virtual machine housed the test instance for this website, including the changes I had been working on all weekend, and I had no other backup? :P

    After a few minutes of cursing and swearing, banging on tables, wondering wtf I had done, and pondering redoing all those changes again, I did what every self-respecting nerd does when they're stuck - turn to google.


    I found these links:

    Here is my solution, which is basically a rewrite of the process in the last link above, with a few more details. I used Linux to do the recovery, mainly because it had commands that I needed. I assume you have some Linux command line knowledge, as all this will be performed in the terminal.

    1. Make a copy of the virtual machine folder in case you screw up.
    2. Look at the size of the snapshot virtual hard disk. If it is more than 2GB and you're running a 32-bit OS, or it is more than the amount of memory that you have available, the following method will probably not work. You're welcome to try though.

      The virtual hard disk files all end in .vmdk. The snapshot one has -xxxxxx on the end of the file name, indicating the snapshot number. For example, if my virtual machine was called Windows Server 2003 Standard Edition, my base parent virtual disk will be named Windows Server 2003 Standard Edition.vmdk, and my snapshot may be named Windows Server 2003 Standard Edition-000002.vmdk.
    3. Find out the CID of the base parent virtual hard disk. Because this virtual hard disk will most likely be larger than 2GB, you won't be able to open it in nano, vi etc. As we only need to read from it, we can use a Linux command to print out only the first 20 or so lines.
      head --lines=20 {base parent vmdk path}

      Replace {base parent vmdk path} with the path to the base parent virtual hard disk file, e.g.
      head --lines=20 /media/sda1/"Virtual Machines"/"Windows Server 2003 Standard Edition"/"Windows Server 2003 Standard Edition.vmdk"
      The CID is the 8-character random string on the line starting with CID=. Write this down somewhere.
    4. Now open up the snapshot virtual hard disk in a text editor, and change the parentCID (not CID) to the CID you recorded in the previous step. Then save. You can use nano, vi or some other Linux editor, e.g.
      sudo nano {snapshot vmdk path}
      Make sure to sudo the command, and also be patient - it could take a few minutes, during which the console may remain black; it is loading.

      I chose to do this in Windows instead, using Editpad Lite which is amazingly fast.
    5. That's it, your virtual machine should now start up again.
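    The steps above can be condensed into a short shell sketch. The file names below are placeholders based on the example above - substitute your own. Note that sed -i rewrites the whole snapshot file, so it can take a while on a large one (which is why step 1, making a backup copy, matters):

    ```shell
    #!/bin/sh
    # Sketch of steps 3-4: copy the parent's CID into the snapshot's parentCID.
    # File names are placeholders - edit them before running.
    PARENT="Windows Server 2003 Standard Edition.vmdk"
    SNAPSHOT="Windows Server 2003 Standard Edition-000002.vmdk"

    if [ -f "$PARENT" ] && [ -f "$SNAPSHOT" ]; then
        # Step 3: the CID lives in the first ~20 lines of the descriptor.
        CID=$(head --lines=20 "$PARENT" | grep '^CID=' | cut -d= -f2)
        echo "Parent CID: $CID"

        # Step 4: point the snapshot's parentCID at it (rewrites the whole file).
        sed -i "s/^parentCID=.*/parentCID=$CID/" "$SNAPSHOT"
    else
        echo "Edit PARENT and SNAPSHOT above to point at your vmdk files."
    fi
    ```

    This is just the manual process automated; if grep finds no CID= line in the parent, stop and inspect the file by hand rather than letting sed loose on it.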

    Further explanation

    If you're interested, here's a deeper look into what you just did. At the beginning of each vmdk file is a disk descriptor section, which contains the properties of that virtual hard disk in text. The CID is a random unique identifier that identifies a particular state of the virtual disk - each time a change is made to the virtual hard disk, the CID changes.
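    For reference, the descriptor at the top of a snapshot vmdk looks roughly like this (the values here are made up for illustration):

    ```
    # Disk DescriptorFile
    version=1
    CID=4f8e1a2b
    parentCID=9d3c7e10
    createType="monolithicSparse"
    parentFileNameHint="Windows Server 2003 Standard Edition.vmdk"
    ```

    It's the parentCID line here that has to match the CID line in the parent's own descriptor.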

    In normal operation, the CID property of the base parent virtual hard disk is synced with the parentCID property of the snapshot virtual hard disk to show that the two files work together. The snapshot has to work with the base parent to be useful, as it only contains the differences from the base parent virtual hard disk. It is important to note that it is the snapshot's parentCID property that is synced with the base parent's CID property, not just the two CID properties in the virtual hard disks - the two virtual hard disks are in a parent-child relationship.

    When you start up the base parent virtual hard disk on its own, however, changes are made to that virtual hard disk without the snapshot being involved, so the CIDs no longer match.

    And when the CIDs no longer match, VMware complains because the snapshot is out of sync and the changes in the snapshot may not apply properly to the base parent anymore, possibly resulting in data corruption.

    By forcing the CIDs to match again, you effectively trick VMware into thinking it was never out of sync.

    Depending on how complex your virtual machine is though, it may be worth recreating your virtual machine after recovering your data because it won't be known where the corruption is, if any. If you did anything to the base parent virtual hard disk before realising and shutting down, e.g. copied files around, the risk of corruption is higher.

  • Stop 3 from fudging up your Windows Mobile clock

    My mate recently bought one of the new HTC TyTN II phones from 3 (aka the HTC Kaiser), and has been raving on about how awesome it is - he's never had a Windows Mobile phone before. Meanwhile, I'm beginning to pick up on all the annoying bits and they're driving me up the wall - which idiot thought the search feature should be an afterthought, and why, when I call someone and they don't answer, can't I pick a different number for them on the spot!

    The phone is quite well done though - I like how they even made the screen tiltable when in keyboard mode. I still have some reservations about the solidity of that sliding mechanism, plus I've been told that when in GPS mode, you can't make/answer calls on the loudspeaker - the GPS software just stops, and the loudspeaker and microphone drop out occasionally. So it sounds like there are some kinks that need to be fixed - 3's QC levels need a bit of improvement.

    Anyway, after he surprisingly managed to get ActiveSync and Outlook working on his computer, he ran into a problem. His calendar kept shifting his appointments an hour later, and every time he tried to fix it, it would work for a while, then randomly shift all the appointments again. For someone who had already become fairly dependent on a PDA, it was driving him nuts, and understandably so, as he no longer knew what was on when, meaning his calendar was useless. Worse, his Outlook was buggered too.

    So we went through the standard troubleshooting steps - checking time zones on his phone and PC, and whether daylight saving was kicking in. It turned out that whenever the appointments were shifted, his timezone had been reset from GMT+10 Sydney to GMT+10 Vladivostok (a Russian city I now know of). If he changed it back, it would stay that way until some random time, then change back to Vladivostok. Windows Mobile (and Outlook) automatically shift your appointments when you change timezones so they'll be right in the new timezone - a useful feature, and not the cause of the issue here, just a side-effect.

    A bit of googling told me that this was a common issue with Windows Mobile (WM5 or WM6) phones on 3. It affects phones that have the automatically update timezone option installed. This is a provider option (i.e. one that providers choose to enable on their phones), although I think it will soon be a standard one. The idea is that your phone automatically switches your timezone whenever you enter a different one, so you don't have to fudge around with changing times etc.

    The implementation of this 'automatically update timezone' functionality is somewhat flawed on Windows Mobile - it doesn't seem to be able to pick the right timezone. I suspect the signal broadcast by the provider only tells the phone what the UTC offset is, e.g. GMT+10, so when Windows Mobile receives it, it scans through the list of timezones and settles on one that matches. It probably stopped at Vladivostok because it is the last GMT+10 timezone in the list.

    Now daylight saving in Vladivostok starts in March (becomes GMT+11) and ends in October (reverts to GMT+10), which is exactly opposite to us - daylight saving for us starts in October (becomes GMT+11), and ends in March (becomes GMT+10).

    So assuming it was last week when daylight saving had not kicked in for us, if I had entered an appointment at 9AM whilst in the GMT+10 Sydney timezone, when Windows Mobile changes to the GMT+10 Vladivostok timezone, it will change my appointment to 10AM, as Vladivostok is in daylight saving (becomes GMT+11), and we're not (GMT+10).

    If this happened this week, when daylight saving kicked in for us, and was reverted in Vladivostok, our appointment times would instead be pushed back 1 hour, as it'll be GMT+11 for us, and GMT+10 for Vladivostok.

    Therefore, although the GMT/UTC offset is the same, because the daylight saving periods in Vladivostok are different from those in Sydney/Melbourne, the two timezones are not interchangeable.
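    If you want to see this for yourself, here's a quick sketch using Python's zoneinfo timezone database (obviously not something you'd run on the phone - just a demonstration of why the two zones differ):

    ```python
    # Demonstration that two "GMT+10" zones are not interchangeable,
    # because their daylight saving periods sit on opposite halves of the year.
    from datetime import datetime
    from zoneinfo import ZoneInfo

    sydney = ZoneInfo("Australia/Sydney")
    vladivostok = ZoneInfo("Asia/Vladivostok")

    # July (our winter): Sydney is on standard time (+10),
    # while Vladivostok is on daylight saving (+11, back in 2007).
    winter_appt = datetime(2007, 7, 2, 9, 0, tzinfo=sydney)  # 9AM in Sydney
    print(winter_appt.utcoffset())                    # 10:00:00
    print(winter_appt.astimezone(vladivostok).hour)   # 10 - shifted an hour later

    # December (our summer): the offsets swap - Sydney +11, Vladivostok +10.
    summer_appt = datetime(2007, 12, 3, 9, 0, tzinfo=sydney)
    print(summer_appt.utcoffset())                    # 11:00:00
    print(summer_appt.astimezone(vladivostok).hour)   # 8 - pushed back an hour
    ```

    Same nominal "GMT+10" zone, yet the same 9AM appointment lands at 10AM in winter and 8AM in summer when re-interpreted in Vladivostok time - exactly the behaviour the calendar was showing.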

    Anyway, enough explanation. The solution is simply to disable the automatically update timezone option, as it is unreliable. To do this,

    • Go to Start -> Settings.
    • Make sure you're in the Personal tab (down the bottom).
    • Select Phone.
    • Click the right arrow at the bottom right corner to scroll the tabs, until you see one titled Timezone. Click that tab.
    • Untick the box there for the Automatically update timezone option (should be the only one there I believe).
    • Tap OK in the top right corner.
    • Make sure your current timezone and time settings are right.
      • Whilst in the Settings screen, go to the System tab.
      • Tap the Clock & Alarms icon.
      • If you're not on the Time tab, select that.
      • Make sure your timezone is GMT+10 Sydney (or whatever it should be), and that the time and date is right.
      • Tap OK to save. 

    And that's it - while your phone will not automatically change times anymore, at least it won't fudge with your calendar now.

    Now let the fun begin when daylight saving periods are re-aligned next year in Australia, ending on April 6th and starting on October 5th instead :) I'm predicting an update will come out from Microsoft soon, but it'll be a manual thing because no automatic update mechanism exists. Hopefully by then HTC/3 will have fixed this automatically update timezone feature so that it actually works too.

    Ah, isn't daylight saving great :P