Thoughts on Oracle v Google stuff

Or, more specifically, after parsing this. Thanks, Noles ;).

Personally, I think that in that context, Oracle will likely win. I do not believe that a language /should/ be copyrightable, but technically it can be; think about how the kinds of material involved might mix with various copyrighted works.

I’ll be the first to admit that our system for copyright, patents, intellectual property, trademarks, and the like is a maze with more than a few turns just full of bullshit. But let’s think a moment: what is it really about? Money. It’s not about fostering innovation (patents) or controlling your property (oi). It’s about money. That’s it, simple.

Java is a product and a creative work, sufficient to be copyrighted. So are the GNU Compiler Collection and that last book you read.

What is the gist of copyright? Wikipedia, as of this writing, defines it as a subclass of intellectual property that is generally “the right to copy, but also gives the copyright holder the right to be credited for the work, to determine who may adapt the work to other forms, who may perform the work, who may financially benefit from it, and other related rights”.

Java, as it applies to Android, is not very different from any other language applied to other systems. The devil is in the details, as they say. An Android application is a collection of Dalvik bytecode, native code, and resources running under the Dalvik virtual machine, and Android provides the runtime.

The implementation is not “Java” as Oracle ships it. In fact, as Microsoft’s various efforts to make a .NET dialect of C++ and projects like JRuby confirm, you can have a fair bit of abstraction between *concept* and implementation. Android developers typically write code in Java against an interface documented in Java. They could just as easily write in any language you can eventually wrangle into Dalvik bytecode! Android applications can be, and have been, written in other JVM languages, and in non-JVM languages. As for the interface, well hell, many things in the .NET world are done in C#, but you could just as easily use Visual Basic or F#. It really helps to be able to read C#, though, just like how on many systems it helps to be able to read C and/or C++.
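To make that concrete, here is a throwaway sketch of my own (the class and names are made up, nothing here comes from Android or Oracle): the exact same source can be compiled with javac and run on a stock JVM, or pushed through the Android tool chain into Dalvik bytecode. The language and the library interfaces are the shared part; the virtual machine underneath is not.

    import java.util.ArrayList;
    import java.util.List;

    // Nothing here is Dalvik specific. The same source can be compiled with
    // javac and run on a stock JVM, or translated into Dalvik bytecode for
    // Android. The language and the interfaces are shared; the runtime is not.
    public class Greeter {
        public static void main(String[] args) {
            List<String> names = new ArrayList<String>();
            names.add("Oracle");
            names.add("Google");
            for (String name : names) {
                System.out.println("Hello, " + name);
            }
        }
    }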

That runtime part that Android applications depend on is quite “Java like”. Many intrinsic components are provided; C programmers should think of the stdio library, because that is the sort of thing that has been “copied” from “Java”: essential programming interfaces, not implementations but interfaces (as far as Oracle holds rights to, anyway). GNU implements C’s I/O library in their runtime. So does Microsoft in their own. They didn’t have to supply crap like printf() and puts(); they could’ve supplied their own pf() and IoConsolePutLStr() functions! Neither group owes the other jack shit over that. But hey, printf() and puts() are what are widely used: even in other languages!!!!
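In Java terms, the distinction looks roughly like this (a toy example of my own, not anyone’s actual code): the interface is just the names and signatures that existing code is written against; the implementations behind them can be written completely independently.

    // The "interface": nothing but names and signatures, the part that has to
    // stay the same so existing code keeps compiling. This is the printf() and
    // puts() analogy.
    interface Console {
        void print(String text);
        void printLine(String text);
    }

    // One independent implementation of that interface...
    class StdoutConsole implements Console {
        public void print(String text) { System.out.print(text); }
        public void printLine(String text) { System.out.println(text); }
    }

    // ...and another, written from scratch, sharing only the names and
    // signatures above.
    class BufferedConsole implements Console {
        private final StringBuilder buffer = new StringBuilder();
        public void print(String text) { buffer.append(text); }
        public void printLine(String text) { buffer.append(text).append('\n'); }
        public String contents() { return buffer.toString(); }
    }

Code written against Console does not care which implementation it gets; that is the whole point of an interface.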

A lot of things in Android’s runtime are also unique: for example, the parts that make it Android specific rather than something you could compile under Oracle’s development kits for the PC. The implementation is not copied, but the conceptual interface is.

So that’s the problemo: how far does that level of control and ownership extend to derivatives? And what actually constitutes a derivative work?

Is copying the script of a movie scene for scene, line for line, and reshooting it for your own release and profit an issue? Yeah. Obviously. Is doing a movie about a train going to allow whoever owns the copyright on some other movie with a train to sue your ass for it? It shouldn’t, unless it’s close enough to the former case, or has some other legal problem of its own.

It’s more a question like: should Matz and Oracle be able to sue the developers of JRuby for copyright infringement, because it bears an even stronger resemblance to both Ruby’s and Java’s programming interfaces than Android’s runtime does to Java’s? Languages like C, C++, C#, Common Lisp, Scheme, and ECMAScript are formally standardized to some extent. Languages like Java, Python, Perl, Ruby, and Lua are not. Could Niklaus Wirth (or Apple) have sued Borland over Delphi?

I do not feel that it is responsible to exercise such strong-arm aggression against users. It’s bad for Java, it’s bad for business, it’s bad for the continuing evolution of the industry, and it’s bad for those who have already invested.

And as far as I am concerned, enough programming languages “borrow” stuff that applications of copyright the way Oracle must be seeking are not feasible, and may very well fuck up language development for the next decade. Now we have to worry about what the fuck we name our classes? What next, companies exerting control over _our_ works made with their tools?

Thanks Oracle, hope your stock plummets and your employees find good jobs with businesses that offer better job security.

Having glanced at news of the new Kepler card running the Samaritan demo, a job that originally took a trio of GTX 580 cards (can you say expensive and smokin’ hot?), I couldn’t help but wonder just how much optimization work may have been invested between NVIDIA and Epic Games to get that monster running on the new card.

Well, it turns out from the sound of it that it probably wasn’t quite as bad as I thought it would be, or at least not in the ways I expected. The new FXAA (Fast Approximate Anti-Aliasing) technique is one such “unexpected”. The fairly simple write up in the article hints to normal people why the amount of video memory on graphics cards has been going up so much. Damn, I remember when 32M was just starting to fade away, and when 256M was the going norm; alas, times change.
Anti-aliasing is one of those things people see in their game’s video settings (usually) but don’t understand. They can only surmise that it must be “better” the higher the number gets, and that e.g. 16x AA must be better than 0. Ditto for things like anisotropic, trilinear, and bilinear filtering. It really isn’t hard to understand. Further complicating the life of a gamer, some games let you choose various types of anti-aliasing (such as MSAA or CSAA).
Anti-aliasing makes things look “Better” in simple terms. Let’s just say that aliasing is a distortion of the image. Something like this: 
The above picture, taken from Wikipedia, shows the effects of anti-aliasing on the right. Simply put, it makes things better. In some games it can be a lot more noticeable than in others; Raven Shield or Call of Duty, for example, might benefit from AA a lot more than Sonic the Hedgehog or Pong.
The problem is that it is expensive. Imagine you can make things appear less jagged, but you must pay a painter to smooth out the image. Now imagine that you have to do this not only for every frame of game play, but also for every pixel on screen. On my computer that is 1,920 x 1,080 = 2,073,600 pixels. Most games need at least 25-35 frames per second to be playable smoothly, and 50-60 can be noticeably better.
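To put rough numbers on that (a back-of-the-envelope sketch; the resolution, frame rate, and sample count below are just example inputs, not measurements of anything):

    // Back-of-the-envelope math for how AA multiplies the work.
    public class AaMath {
        public static void main(String[] args) {
            long width = 1920, height = 1080; // pixels on my screen
            long fps = 60;                    // a comfortable frame rate
            long samples = 4;                 // e.g. 4x AA

            long pixelsPerFrame = width * height; // 2,073,600
            long samplesPerSecond = pixelsPerFrame * fps * samples;

            System.out.printf("%,d pixels per frame%n", pixelsPerFrame);
            System.out.printf("%,d samples per second at %dx AA and %d fps%n",
                    samplesPerSecond, samples, fps);
        }
    }

That works out to roughly half a billion samples per second for a modest 4x, before the game has done any of its actual rendering.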
The artist is using your CPU and GPU (graphics card) to do all this work. Because the screen works with “Pixels”, little dots, it can be a hell of a lot of work but hey, it’s easier to smooth out your neighbors than using a paint brush.
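If you want to see the “smoothing out your neighbors” bit in code, here is a minimal sketch of my own (grayscale only, brute-force supersampling; real MSAA and friends are cleverer about where they take samples, but the averaging is the point): render at double resolution, then average each 2x2 block of samples down to one pixel.

    // Crude supersampling resolve: 'big' is a grayscale image rendered at twice
    // the width and height of the final output; each output pixel is the
    // average of the corresponding 2x2 block of samples.
    public class BoxDownsample {
        static int[][] downsample2x(int[][] big) {
            int h = big.length / 2, w = big[0].length / 2;
            int[][] out = new int[h][w];
            for (int y = 0; y < h; y++) {
                for (int x = 0; x < w; x++) {
                    int sum = big[2 * y][2 * x]     + big[2 * y][2 * x + 1]
                            + big[2 * y + 1][2 * x] + big[2 * y + 1][2 * x + 1];
                    out[y][x] = sum / 4; // smooth out the four neighbors
                }
            }
            return out;
        }
    }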
 
Techniques for doing all this shit have varied with time, but in essence they can vastly change the workload on your computer. One way of thinking about it: if you can play a game fine in 800×600 with AA, you might be doing more work than playing it in 3200×2400 without it! (800×600 at 16x supersampling is 480,000 × 16 = 7,680,000 samples per frame, exactly the same as 3200×2400 with no AA at all.) This site has a great and simple write up about various techniques.
In general a little anti-aliasing is good, but you are not likely to notice much from turning it all the way up to 16x FSAA, though your computer might cry.
FSAA should probably be avoided unless you want to boast about your hardware or publish screen shots that look good.
MSAA is close enough that, for how much less work it takes to do, it’s worthwhile.
CSAA and CFAA are available on most decent graphics cards since 2006, and are worth considering. Think of them as being to MSAA what MSAA is to FSAA. If you have a fairly dark game like SWAT 4 or Modern Warfare 2, as opposed to a very vibrant and colorful game, CSAA is probably worth it.
If you have a lower end graphics card, try using CSAA. If you can _actually_ tell the difference or just want to flaunt your cash, try MSAA.
In most cases, it is “O.K.” to use 2x or 4x AA in a game now. It’s not like the old days, when computers just couldn’t handle it without better hardware. By the time you reach and exceed something like 8x (MS)AA, you are probably not going to notice the difference all that much, seriously. If the difference between 16x MSAA and 8x MSAA is so easily seen on your screen and it bugs you, you probably should get a top end graphics card.
How to tell what you should use? Welp, just try a value like 4x Anti Aliasing and see how it impacts your game. If your computer can’t handle it, try tuning it down a bit (2x or off). If you can run it fine, hey, feel free to see if you can go up a notch or two.
A little secret though: no one gives a shit as long as the game looks good and plays good. So as long as you don’t turn the setting too high, it’s all good.

Thoughts on “Why Devs hate PC Gamers”

Was reading this article today; it basically laments that game developers hate PC gamers because of the piracy. I can’t agree with most of the remarks about DRM, even less so being a programmer; I’ve written a bit in the past about how I feel about DRM.

What I do generally agree with, though, is this person’s views on pirates. I don’t pirate crap. I don’t even mind paying for crap if it’s worth having. Most games I can’t say are worth the release price, so I don’t buy them. Personally, the only distinction that I see between console and PC for piracy is that on consoles it is harder for the technically inept masses to steal. As the tech-line blurs and development becomes a bit more related, that’s going to change. Just look at the current generation of consoles versus, say, the original Nintendo.

Another day closer to BattleTech, another Trillion away from Avatar

My post title sums up exactly what I think of this BBC article on how controlling stuff with thought is coming along. It really is getting impressive, even if it is unlikely that even our children’s grandchildren will see much more of it than we already have of cloning. The applications for gaming may invoke memories of Marty McFly being told his old rail shooter was a “baby’s toy” for needing the player to use their hands, or at least they will for people getting on in years a bit <_<. But in all seriousness, until such tech is literally as pervasive as game controllers, it’s not going to happen, except maybe for the super rich (JP anyone?).

Expanding the capabilities of UAVs and other war machines is probably the only thing that will cause a sufficient surge in funding for such tech to develop massively; which is sad, IMHO, when letting crippled people walk again is a much better cause. Even more so when, given the cost, it’s a less justifiable Big Fraking Budget for research than improving existing means of remote control. I have no doubt, however, that the army’s model for the next-gen soldier will likely incorporate that level of being “plugged in” as well, in due time. I seriously hope, though, that a company called Cyberdyne with a suit named “HAL” is some Japanese geek’s idea of good humour, and that Skynet is still brewing somewhere beneath Mountain View, California.

For those not in the know: in the world of BattleTech, “MechWarriors” have their physical controls augmented by training their ’Mech to react to their brain waves. For those living under a rock, Avatar displays remote mind control over a synthetic body (so does Surrogates, for those that don’t do enough Sci-Fi), and Skynet is software that will eventually become self-aware and take over the world when it sees how threatening we are to its continued existence. Three Laws of Robotics, my hairy humanoid ass!

In addition to being sad, this article reminds me that a part of programming is that you will likely read several implementations of fundamental data structures, and write one or two yourself. If you can’t figure out How a Data Structure Works (TM), you probably should NOT be using the darn thing.
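Case in point, the sort of fundamental data structure worth writing at least once yourself; a bare-bones linked stack is about as simple as they come (this one is my own throwaway sketch, not anything from the article):

    import java.util.NoSuchElementException;

    // A minimal singly linked stack: the kind of fundamental data structure
    // you should be able to write, and explain, from scratch.
    public class LinkedStack<T> {
        private static final class Node<T> {
            final T value;
            final Node<T> next;
            Node(T value, Node<T> next) { this.value = value; this.next = next; }
        }

        private Node<T> head; // null means the stack is empty

        public void push(T value) { head = new Node<T>(value, head); }

        public T pop() {
            if (head == null) throw new NoSuchElementException("stack is empty");
            T value = head.value;
            head = head.next;
            return value;
        }

        public boolean isEmpty() { return head == null; }
    }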

+1 for git in my book

Parsing this old blog post by Keith Packard somehow makes me happy that I opted to follow the Git path rather than the Mercurial (hg) path when I left CVS behind. I also like the trailing comment xD.

That being said, git and hg are the only version control systems I really respect, above tar. Perforce and Darcs, I at least consider worth a closer look someday. Just haven’t had any excuse to leave Git on my own projects since the first date.

Yeah, I’m a lazy git alright.

Epic Toast Incident

Kristen Gates’ pick for this morning’s You Choose The News has to be the best report since sliced bread:

TWITTER HEADQUARTERS EVACUATED OVER BURNED TOAST

Twitter’s main office in San Francisco had to be evacuated when an employee’s burned toast set off the fire alarm. As expected, some employees talked about the toast incident on Twitter. One employee wrote, “Quick math: 30 min fire alarm, 400 employees ~ $2500 for burnt toast. ooops,” while another lamented, “Nothing like a Friday building evacuation in the rain. Damn toast.”

Wow, you really can spend $2500 on one piece of toast!

I’ve been reading a rather worrisome article talking about a proposal for an Internet treaty. Can’t speak for my fellow netizens, but that is a damn scary idea! Politics and the control of the ’net should have no association, IMHO. One might argue that politics is an essential part of every aspect of our existence, whether we accept it or not, but I think that’s more of a crock.

The older I get, the more the good things fade away into the history books.