Upgrading the DirectX SDK

When I joined Bitsquid a month ago, someone mentioned they wanted to upgrade the DirectX SDK to get some improvements, but that there was a dependency in the way. I was foolish, and volunteered to investigate. Over the past ten days or so I have untangled the whole mess, leading to a successful upgrade. I now want to share my findings so the next unfortunate soul can save some time.

Step 1: Explore

First stop: MSDN's article. I had heard that the DirectX SDK was now included in the Windows SDK, but I wasn't sure what that covered. This article sums it up. A teammate and I went through the whole list, figuring out what we were and were not using. In the end, the only problematic components were XInput, XAudio2, and D3DX9Mesh. The bulk of the codebase had already been converted away from D3DX, which was great!

However, another thing needed clearing up. Our minspec is still Windows 7. How was that going to work? Luckily, MSDN had the answer again. This article reveals that the Windows 8.x SDK can be used to target Windows 7. This is covered in more detail on this page and that page.

Step 2: Well let's just try then

I changed the paths in our project generation files to the Windows SDK. I also added the June 2010 SDK, but only for XAudio2 and D3DX9Mesh (more on XInput further down). After fixing only a few compile errors, things seemed mostly fine... until I got a runtime crash about ID3D11ShaderReflection. Huh?

Step 3: GUIDs and the magic #define

I had wrongly assumed that the link errors I had been seeing when changing the paths were caused by DX9, because I read too fast. Linking with the old dxguid.lib made the errors go away, so I didn't give it further thought. However, a large part of DirectX relies on GUIDs, unique hardcoded identifiers. When debugging, I noticed that IID_ID3D11ShaderReflection had the wrong value compared to the Windows SDK header, which was causing the crash. I went on a wild-goose chase for whatever was changing this value, and wasted a day looking for a wrongly included file.

As it turns out, those GUIDs are extern variables by default, and get their values from .lib files. I was linking with an old one. Mystery solved! I removed dxguid.lib from the linker, but that of course left the GUIDs undefined. The solution is to #define INITGUID before including windows.h. Thanks to the Ogre3D forums for pointing me towards the relevant support page, since they had encountered the same issue before. At this point everything was fine, except that it was failing on the build machines.
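A minimal sketch of the fix (the include list is illustrative; the important part is that INITGUID is defined before any Windows header is pulled in):

```cpp
// Sketch: with INITGUID defined before the Windows headers, the DirectX
// interface GUIDs (IID_ID3D11ShaderReflection etc.) are defined in this
// translation unit instead of being resolved from dxguid.lib at link time.
#define INITGUID
#include <windows.h>
#include <d3d11.h>
#include <d3d11shader.h>
```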

Step 4: d3dcompiler

The first error had been around for a long time: we had, unknowingly, been relying on the d3dcompiler DLL being present in System32! Since System32 is part of the default DLL search path, this is easy to overlook, especially when the DirectX SDK is a required install anyway. We now depended on a more recent version, which ships with the Windows SDK. Yet it was still failing, because we did not have a proper installation step. I tweaked the project files again, adding a copy step for that DLL. CI, however, was still failing.
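The copy step was along these lines; the paths are illustrative, and the DLL name depends on the SDK version (d3dcompiler_46.dll in the Windows 8.0 SDK, d3dcompiler_47.dll in 8.1 and later):

```bat
rem Sketch of a post-build copy step. %WindowsSdkDir% and %OutDir% are the
rem usual MSBuild-style variables; adjust the architecture folder to match
rem your target (x86/x64).
copy "%WindowsSdkDir%Redist\D3D\x86\d3dcompiler_47.dll" "%OutDir%"
```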

Step 5: XInput

XInput comes in several versions in the Windows SDK. 1.4 is the most recent one as I'm writing this, and is Windows 8-only. To use XInput on Windows 7, you need version 9.1.0. For that, ensure that the magic _WIN32_WINNT #define is set to the proper value (see further up on the page). You also need to explicitly link with XInput9_1_0.lib and not XInput.lib, or Windows 7 will crash at runtime trying to fetch XInput1_4.dll, which doesn't exist there. In my case this was breaking the automated tests on a Windows 7 machine, but was completely fine on my Windows 8 workstation.
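A sketch of what this looks like in code (the polling function is illustrative; 0x0601 is the _WIN32_WINNT value for Windows 7):

```cpp
// Sketch: target the XInput 9.1.0 API surface so the binary runs on
// Windows 7. With _WIN32_WINNT at 0x0601, XInput.h exposes the 9.1.0
// subset; the pragma links XInput9_1_0.lib instead of XInput.lib.
#define _WIN32_WINNT 0x0601
#include <windows.h>
#include <XInput.h>
#pragma comment(lib, "XInput9_1_0.lib")

void poll_first_pad()
{
    XINPUT_STATE state = {};
    if (XInputGetState(0, &state) == ERROR_SUCCESS) {
        // Pad 0 is connected; read state.Gamepad here.
    }
}
```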

Step 6: Profit?

As far as I can tell this should be the end of it, but the rendering team has yet to stress-test it. We'll see what breaks as they poke around :)

Hopefully this can save you some time if you're doing a similar upgrade, or convince you to give it a try if you've been holding back.

UEFI rhymes with misery

(Yes, it does rhyme, if you pronounce the acronym the French way :D)

TL;DR: Don't try to create a bootable USB for Windows by fiddling around with diskpart. Get a USB DVD drive instead, or use proper tools. If your Windows boot breaks, have a DVD on hand, go into repair mode, and run the automated repair up to three times. If it says "not compatible", try booting the DVD as UEFI from your BIOS. Have up-to-date backups and a Linux live USB on hand at all times. And dual boot is apparently easier than the Internet seems to think.

Let's start with a quick timeline of the events of late last week:

  • Received the last parts for my desktop computer after a postal hiccup.
  • Put the Beast together. It's surprisingly easy nowadays! Just don't forget to plug in the CPU fan. Just sayin'.
  • Found out that Windows 8.1 was 300 SEK cheaper as an OEM disc from Webhallen.
  • Decided that surely it wouldn't be that hard to create a bootable USB from that.

WRONG. It turns out that the OEM disc does not allow you to "Install to media" from the Windows 8 setup, like a retail version does. Unfortunately, the tutorial I found suggested doing some blood magic with diskpart. I created an ISO and followed the commands, thinking I was doing things right. But some process was holding a handle on a file from the ISO (?), so I just went into Process Explorer, closed the handle and kept going. Yes, this should have set off a few alarm bells. I was still getting "File in use" errors, so I restarted...

Windows won't boot

Well, this is interesting.

Yes, you read that right: I managed to fry my laptop's boot by trying to make the install USB for my desktop. "Restart and select proper Boot device". That didn't sound very good. Was this "BCD" thing important, then...?

Cutting to the chase: of course it was. BCD stands for "Boot Configuration Data", and it's a vital file if you want your Windows to start at all. I sighed, dug up the Linux Mint USB stick I had already made, fired up my MacBook Air for some googling, and settled in for a very annoying Sunday afternoon. Apologies to my Twitter followers for all the cursing, by the way :)

The first step was getting the laptop to boot into Mint, to make sure everything was alright apart from the boot itself. I had made the USB earlier for the desktop, and it was completely painless: download the ISO from the website (I went for Cinnamon), use a tool to create a bootable USB (I used Universal USB Installer), and there you go. You might have to iterate on the formatting somewhat (NTFS or FAT32). Then start up the computer, enter the BIOS, and set the USB as the first boot device. I had to switch between the UEFI and the non-UEFI version to get it to start, wondering what that new acronym meant.

I got it to start up under Mint, checked my drives: relief, everything was still there. Note: at that point I kept trying to fix things without backing up first. This is a terrible idea and you should not do the same; I was lucky not to lose anything. Make a backup before you fiddle with your partition table! Also, everything that follows is made much more interesting by the fact that I did not have a Windows 7 DVD handy.

Looking into GParted was quite educational: I saw that my laptop in fact has five partitions. "System", hidden (no letter assigned) and quite small. Another partition that GParted couldn't identify. Then "OS", my C: drive. "Data", my D: drive. And "Recovery", hidden as well. It was the first time I saw the naming scheme for those: sda for the hard drive, sda1 to sda5 for the partitions, sdb for the USB drive I had plugged in.

The first thing I had vague memories of was the MBR (Master Boot Record). I couldn't explain to you exactly what it is, and frankly I didn't want to dig too deep. But basically, the first possible reason for your Windows not to boot is that the installation is no longer listed in the MBR. The easy way to fix this is to plop in your Windows recovery disc, but as mentioned I didn't have one, so I tried doing things the hard way. Googling turned up this tutorial, which I followed religiously, with a ridiculous number of reboots in the process. It didn't help, though. Since my Windows was still listed in the available boot devices, just failing to start, I began to suspect it was more serious than a broken MBR.

I then found out about BootRepair (thanks to Jeroen Baert for the tip). It's an Ubuntu tool, but since Mint is an Ubuntu derivative, it's available there too. You have to add a new APT repository to get it; just follow the instructions on the page. Then you click the big friendly button, let it work its magic...
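For reference, the instructions boiled down to roughly this (the PPA name is the one given on the Boot-Repair page at the time; run it from the live session):

```shell
# Add the Boot-Repair PPA, install the tool, then launch it.
sudo add-apt-repository ppa:yannubuntu/boot-repair
sudo apt-get update
sudo apt-get install -y boot-repair
boot-repair
```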

... Still no luck. The next step was Super GRUB Disk. I created yet another bootable USB, this time from my Mac using Mac Linux USB Loader; I think I had to format the USB to FAT32 beforehand from Disk Utility. I picked the right boot device and was able to start it up. Once SGD starts, it lists the available boots on your machine more extensively than your BIOS does. There I saw the three Windows boots available: System, OS, Recovery. I tried to start each of them and got the same error message each time: it was failing to find the BCD file. The message went something like this:

File: /Boot/BCD
Status: 0xc000000f
Info: an error occurred while attempting to read the boot configuration data.

Throughout the previous googling, I had encountered "BCD" and "UEFI" more and more often. I looked into it a bit more and figured out that my laptop was a complicated case because of UEFI. I do not know the details (and don't want to know them), but it's some kind of more modern BIOS replacement which, for Windows, is coupled with a secure boot mechanism. It's the reason for the multi-partition setup... and it makes fixing anything more complicated. I did not investigate the exact details, but apparently with a classic BIOS you just have a blob at the very start of the disk that points directly to the boot files. With UEFI you have this intermediary System partition making everything messier.

At this point it was becoming obvious that I needed a Windows recovery environment up and running if I was to get anywhere. I hunted down a Windows 7 ISO (which is completely legal to download) and created yet another bootable USB. I managed to boot from it after randomly toggling between UEFI and non-UEFI... somehow UEFI wasn't always the right one! "Repair my system"... and it complained that the Windows I was trying to repair wasn't compatible. Darn.

This, however, does not prevent you from opening a command prompt: hit Shift-F10 to get one. From there, I followed this tutorial. Solution one was a no-go due to the compatibility failure. Solution two did not work. Solution three, "Nuclear Holocaust", started throwing "The requested system device cannot be found" errors at me halfway through. Googling some more, it seems the procedure will not behave properly unless you're running from a DVD. I cursed some more (sorry again, Twitter), turned the whole thing off for the night, and used my Mac instead.

Luckily, I work for a tech company and not everybody was on vacation, so I was pretty optimistic about my odds of borrowing a Win7 DVD from a colleague. Daniel Collin came through with flying colors, along with a USB DVD reader that would also let me set up my desktop. After finally completing a backup (don't wait like I did! Do it from the start!), I plopped in the disc on Tuesday and attempted to repair my system.

"Incompatible" again. Damn. I restarted, went back to the BIOS, and made sure to start the DVD as UEFI. And this time I was finally able to access the recovery mode! I ran the automated repair, let it run, restarted, booted up recovery again, ran the repair a second time, restarted... It's actually recommended to run it three times, but the third time couldn't find any issues anymore. So I rebooted, tweaked the devices order in the BIOS...

I was never so happy to see the Windows start screen.

Conclusion: this is a bit of an underwhelming finish, because it was the automated repair's magic that fixed it, not anything I did myself. My guess is that it was able to do what I had tried to do with bcdedit but couldn't because of the error.

Bonus: Win8 + Mint dual boot on the desktop!

Since I wasn't done having fun with partitions, MBRs and other things that make me praise the sky I'm a software engineer and not a sysadmin, I wanted to set up a dual boot on my brand new desktop. I used the USB DVD drive to install Windows 8, which went swimmingly if you ignore the creepy default settings. I then picked up my faithful Mint live USB again, plopped it in, installed, rebooted... Nothing but Windows. It seems that by default it does not install in a way that makes you boot into GRUB. I found this wonderfully objective tutorial which explained a thing or two, and decided I was just going to be lazy and run BootRepair. I booted the live USB again, installed the package and ran it. The only problem I had while running the commands it suggested was a missing dependency (linux-headers-something): I just took the apt-get line that was failing, changed the package name to the missing dependency, ran that, then ran the original command, and it proceeded just fine. The final window said there were errors, but still advised rebooting, and so I did. I was greeted by GRUB asking me to pick an OS to boot, and everything seems to work fine. So, yay for BootRepair, and thanks to UEFI for forcing me to use magic tools instead of understanding what's happening.

Do note that I did not have to disable Secure Boot at any point. Your mileage may vary, but it seems it's not necessary anymore, and many tutorials on the web are out of date.

Second bonus: As a reward for making it through the whole post, here's a chocolate cake recipe I promised to write up.

Ingredients: 200 g of dark baking chocolate, 100 g of butter (preferably salted), 50 g of flour, 100 g of sugar, three whole eggs (well, without the shell). Do not use milk chocolate or the batter will be too liquid.

Heat the oven to 220°C. Melt the chocolate and the butter together in a bain-marie, stirring slowly so that it gets a nice silky texture. While it's melting, put the flour, sugar and eggs in a mixing bowl, without mixing just yet. Add the chocolate and butter and mix quickly (or your eggs will turn to omelette from the heat...) until the whole thing is homogeneous. Pour into a metal or Pyrex-like mold (one of those oval glass things for chicken is perfect): the batter should "feel" homogeneous, neither too liquid nor too solid, and make nice ribbons while you pour it. Then put it in the oven for 10 minutes. It's perfect when you can stab it with a knife and have said knife come back with just a little chocolate on it. Leftovers can be stuck in the fridge for a very buttery texture that's awfully heavy but delicious too, though of course it's better warm.

Updating Wordpress: localization woes

Today, I decided to update WordPress. In related news, you might find a lot of curses in my tweets today. I'm writing up my findings in case it helps other unfortunate souls.

WordPress localization works by hardcoding English strings in the source code and listing them in a .pot file along with the source file names and line numbers; translations then come from a .po and .mo file pair. The .mo file is a compiled binary format, while the .po is similar to the .pot but with the translations filled in.
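For reference, an entry in a .po file looks roughly like this (the file path, line number and translation below are illustrative, not taken from the actual theme files):

```po
#: inc/template-tags.php:42
msgid "Posted on"
msgstr "Publié le"
```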

So what's the problem with this? I'm using the Twenty Eleven theme, because it's simple, it was built in and I like it. Sounds reasonable enough. However, its localization files used to live in the WordPress core. This was (rightfully) deemed unsatisfactory at some point, and they were moved out into a languages directory inside the theme itself... but only for the more recent themes. Which means that for the older Twenty Eleven, the files were removed from the core but never added back. Result: the French version of my website still says "Posted on", verbatim.

One solution is to go find language files, but my Google digging did not yield anything for this theme. There are files in the older French WordPress core, but they're not compatible. Since the line numbers are baked in, that's not really a surprise! I tried to update the line numbers in the .po file, but it still didn't work.

In the end, I decided to live with the somewhat crappy mix. I do not have the energy to keep digging to bypass clunky design, and upgrading my whole theme just for that would be overboard.

Another issue when updating was that my translation plugin, qTranslate, was no longer compatible. It seems to be abandoned, but luckily mqTranslate, a fork with team features, has picked up the banner. It includes a database conversion function and the transition went pretty smoothly overall! I just had to enable French again and tell it to use the overridden date formats.

x86 on a 64-bit system with SCons and MSVC

As stated previously, I'm trying to get into DirectX 11. For a change I'm not using Visual Studio, but Sublime Text and SCons, with the MSVC compiler because of the DX11 headers' incompatibility with MinGW.

Link time, d3d11.lib et al... Link error. Unresolved symbol for D3D11CreateDeviceAndSwapChain. What? I did set the library path (to the x86 binaries) and linked to the lib file...

Sure. But by default, SCons uses the MSVC version matching your system: 64-bit for me. The solution is to add TARGET_ARCH='x86' to the Environment constructor call. Adding it afterwards using Append doesn't work.
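For illustration, a minimal SConstruct along those lines (the target name, source file and library list are made up):

```python
# SConstruct sketch: TARGET_ARCH must be passed to the Environment
# constructor, because SCons selects the MSVC toolchain during environment
# setup; env.Append(TARGET_ARCH='x86') afterwards has no effect.
env = Environment(TARGET_ARCH='x86')
env.Append(LIBS=['d3d11', 'dxgi'])
env.Program('demo', ['main.cpp'])
```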

Sublime Text 2 and SCons

Well, might as well post it somewhere in case it helps someone and so as not to forget it.

To the point. A friend of mine (a demoscener known as xtrium, already mentioned here) is working on a 3D engine, and as I'll be lending a hand I needed to build the project. He uses SConscript files; I wanted to get started with Sublime Text and not have to use the "wonderful" Windows command line. So: a custom build system in Sublime Text.

To anyone saying, "just install Linux", I'm more comfortable with Windows and love my usual software. So no penguin here ;)

Anyway, here is the sublime-build file I added in the Packages/Users directory to get a SCons build system. The encoding might not work on your system; I got it working on my machine and did not look further.

{
    "cmd": ["scons.bat"],
    "file_regex": "^(..[^:]*):([0-9]+):?([0-9]+)?:? (.*)$",
    "working_dir": "${project_path:${folder}}",
    "encoding": "cp1252"
}

The first line calls the SCons executable; it assumes that the containing directory is in your path.
The second one lets you jump to the error line in the source file when clicking on the error message.
This one works for GCC (MinGW at least); I've posted one for Visual C++ below.
Working dir sets the... working directory - no kidding - and assumes you have the SConscripts in the Sublime Text project directory.
And the encoding part removes the [Decode error - output is not utf-8] error, at least on my machine (French Win7 64). No guarantee about yours.

Hope it helps someone!

Edit: Due to the DX11 headers not compiling with GCC, I had to switch my SCons environment back to Visual C++. Of course, that changes the error message format. Here's the (maybe not very robust) regular expression I used instead:

"file_regex": "^([^(]+)\\(([0-9]+)\\)() : (.*)$"