I feel like I'm the last developer in the world who is dipping his toes into XML. So what I have to say about it is probably useless to the rest of you, but here are my latest experiences anyway.
So how's this for not falling prey to speculative generality or premature abstraction: I don't like to make tools until a clear need is demonstrated - when you can tell a content creator is going to be a bottleneck and needs workflow improvements. On Schizoid, and through much of our current project, code was always the bottleneck. (Actually, just recently it may have become art. Time to make some art tools.)
To that end, on Schizoid and through much of this project we've had content creators do their creation directly in the code base, editing data structure declarations in .h files (well, C# files on Schizoid), piles of curly braces and all.
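For the curious, the data-in-code approach looked roughly like this - a struct and a big initializer list sitting in a header. (This is a made-up sketch; UnitDef and the stats here are invented for illustration, not our actual data.)

```cpp
// Hypothetical sketch of data living directly in a .h file.
struct UnitDef
{
    const char* name;
    float       hitpoints;
    float       armor;
    float       speed;
};

static const UnitDef g_unitDefs[] =
{
    // name     hitpoints  armor  speed
    { "grunt",  100.0f,     5.0f,  2.5f },
    { "tank",   400.0f,    20.0f,  1.0f },
    { "scout",   60.0f,     1.0f,  4.0f },
};
```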
That all changed recently because Richard's machine had trouble installing Visual Studio, so I thought it was time to do the right thing and put our data in text files. Not to mention, we're eventually going to have to, for Downloadable Content purposes. It's been many years since I've written code to read data from a text file. My MO has always been to use fscanf.
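For reference, the fscanf approach I mean is roughly this kind of thing - a minimal sketch, assuming a whitespace-separated file of records; the format and the field names are invented for the example:

```cpp
// Old-school fscanf loading: read "name hitpoints armor speed" records
// from a plain text file into a fixed-size array.
#include <cstdio>

struct UnitDef
{
    char  name[64];
    float hitpoints;
    float armor;
    float speed;
};

bool LoadUnits(const char* path, UnitDef* units, int maxUnits, int* outCount)
{
    FILE* f = std::fopen(path, "r");
    if (!f)
        return false;

    int count = 0;
    while (count < maxUnits &&
           std::fscanf(f, "%63s %f %f %f",
                       units[count].name,
                       &units[count].hitpoints,
                       &units[count].armor,
                       &units[count].speed) == 4)
    {
        ++count;
    }

    std::fclose(f);
    *outCount = count;
    return true;
}
```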
So, time to catch up with the new millennium. Do I really want XML, for starters? There was a whole XML vs. JSON debate going around a while back - my friend Bryan McNett was a big JSON advocate, and he did really cool stuff in the Spidey 3 codebase where you could tag data structures in your .h files and a text processor would create the JSON importing code for you. He's written articles about it, but I can't seem to find them.
In the end, I decided to try XML rather than JSON, because there seemed to be more support (XML handily won the Googlefight), but the deciding factor was being able to easily load XML files into Excel. Having all of our unit data in one big, nicely formatted table (the alternating dark blue and light blue rows soothe my soul) won me over. And McNett's .h file markups may have been cool, but we don't have enough different data structures to make adding another step to our build process worth the effort. (Not to mention there's no reason you couldn't theoretically do the same thing with XML.)
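To make the Excel point concrete, the kind of file I'm talking about is just a flat list of records - something like the sketch below (element and attribute names invented for the example), which Excel will happily open as a table, one row per Unit element:

```xml
<!-- Hypothetical units.xml: one <Unit> element per table row. -->
<Units>
  <Unit name="grunt" hitpoints="100" armor="5"  speed="2.5" />
  <Unit name="tank"  hitpoints="400" armor="20" speed="1.0" />
  <Unit name="scout" hitpoints="60"  armor="1"  speed="4.0" />
</Units>
```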
The next question becomes: which XML library do I use with C++? With C# it's built in, yo. Again I mostly relied on Google - when in doubt, do what's popular. The second hit was TinyXML, which seemed great: free license and small. Hooking it up and getting started was fairly painless. I usually felt like it took 2-3 lines of code to do what should take 1, but with some small wrappers I was OK.
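To give a rough idea of what that looks like, here's a minimal TinyXML sketch for loading and walking a file. It uses the hypothetical units.xml from above, so the element and attribute names are my invention, not our actual data:

```cpp
// Minimal TinyXML sketch: load a file and walk its <Unit> elements.
#include "tinyxml.h"
#include <cstdio>

void LoadUnitsXml(const char* path)
{
    TiXmlDocument doc(path);
    if (!doc.LoadFile())
    {
        std::printf("failed to load %s: %s\n", path, doc.ErrorDesc());
        return;
    }

    TiXmlElement* root = doc.RootElement();   // <Units>
    if (!root)
        return;

    for (TiXmlElement* unit = root->FirstChildElement("Unit");
         unit;
         unit = unit->NextSiblingElement("Unit"))
    {
        const char* name = unit->Attribute("name");
        float hitpoints = 0.0f, armor = 0.0f, speed = 0.0f;

        // QueryFloatAttribute reports success, wrong type, or missing
        // attribute - a natural place to hang some data validation.
        unit->QueryFloatAttribute("hitpoints", &hitpoints);
        unit->QueryFloatAttribute("armor", &armor);
        unit->QueryFloatAttribute("speed", &speed);

        std::printf("%s: hp=%g armor=%g speed=%g\n",
                    name ? name : "?", hitpoints, armor, speed);
    }
}
```

That's the "2-3 lines to do what should take 1" feel I mean - it's all FirstChildElement and Query*Attribute calls, which is exactly the sort of thing a small wrapper tidies up.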
It took me about a day and a half to learn pidgin XML, hook up the library, and get our data converted. I tried creating a schema to assist the XML import into Excel and threw up my hands - Excel seems to do a fine job on its own, anyway. We are schemaless. A schema, I know, would help with data validation. We do some in code, but not enough, yet.
One of the big things I miss about the old way - typing our data straight into the code - is that the compiler validates our data for us, and does a great job.
One thing I didn't like about the old way was simply the formatting of the files - you end up with a lot of comma-separated values and it's easy to lose track: is this float here his hitpoints or his armor? The new files are much easier to read and maintain.
What about XML vs. rolling our own format? This also feels like a win. Not a huge win - here I am writing a lot of XML glue code instead of a parser - but a material one.
And so, I've finally caught up with the new millennium.