Thursday, July 24, 2008

Reset MySQL password

If you have forgotten your MySQL root password and you have root access on your Linux box, just follow these steps.

1. Stop the MySQL service

# /etc/init.d/mysql stop

or, on distributions that use the service command:

# service mysqld stop

2. Start MySQL with password checking disabled.

# mysqld_safe --skip-grant-tables &

Note: while grant tables are skipped, anyone who can reach the server can connect without a password, so do this only briefly (adding --skip-networking as well blocks remote connections in the meantime).

and the output is:


[1] 5988

then press Enter.

3. Connect to the MySQL server using the mysql client.

# mysql -u root


Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 1 to server version: 4.1.15-Debian_1-log

Type 'help;' or '\h' for help. Type '\c' to clear the buffer.


4. Set a new MySQL root password.

mysql> use mysql;

mysql> update user set password=PASSWORD("NEW-ROOT-PASSWORD") where User='root';

mysql> flush privileges;

mysql> quit

5. Restart the MySQL service

# /etc/init.d/mysql restart

or:

# service mysqld restart
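The five steps above can be sketched as a single script. This is a convenience wrapper, not part of the original guide: it assumes a Debian-style /etc/init.d/mysql init script and the pre-5.7 `password` column used in the UPDATE statement above, so adjust both for your system.

```shell
#!/bin/sh
# Sketch of the reset procedure as one function (assumptions above).
reset_mysql_root() {
    newpass="$1"
    sudo /etc/init.d/mysql stop
    sudo mysqld_safe --skip-grant-tables &
    sleep 5    # give the server a moment to come up
    mysql -u root <<SQL
UPDATE mysql.user SET password=PASSWORD('$newpass') WHERE User='root';
FLUSH PRIVILEGES;
SQL
    sudo /etc/init.d/mysql restart
}
```

Call it as `reset_mysql_root 'NEW-ROOT-PASSWORD'`.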

Guide to a faster Ubuntu

Optimize the Ubuntu boot sequence by profiling it

  • While you are in the GRUB menu, highlight your preferred kernel version and press "e".
  • Highlight the line beginning with kernel and press "e" again. Press the End key to go to the end of that line.
  • Add the word "profile" (without quotes) to that line and press Enter.
  • Now press "b" to continue booting.
This one-time special boot may take longer than an ordinary boot, but while it runs Ubuntu monitors file usage and preloads those files during subsequent boots.


prelink: Resolve shared library addresses in advance

Any executable that makes heavy use of shared libraries can benefit from prelinking. Prelinking resolves the addresses of shared libraries in advance, which reduces the number of relocations performed at program startup.

Prelink is also useful in the context of security, since we can tell prelink to make libraries load at random addresses until its next run. This is useful because libraries then won't load at fixed addresses on every system.

You can install prelink by issuing the following command.

sudo apt-get -y install prelink

Then change the line PRELINKING=unknown to PRELINKING=yes inside the configuration file /etc/default/prelink.
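If you'd rather flip that configuration switch non-interactively, a one-line sed does it. The sketch below edits a temporary copy so the effect is visible; on a real system, point the same sed command (with sudo) at /etc/default/prelink instead, and note it assumes the shipped default line is PRELINKING=unknown.

```shell
# Demonstrate the edit on a temporary copy of the config file.
conf=$(mktemp)
echo 'PRELINKING=unknown' > "$conf"
# Run the same sed against /etc/default/prelink (with sudo)
# to enable prelinking for the daily cron job.
sed -i 's/^PRELINKING=unknown/PRELINKING=yes/' "$conf"
cat "$conf"
```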
We will do our first prelinking by executing the following command.

sudo /etc/cron.daily/prelink

That's it; you don't have to do anything else. The prelink cron job will run periodically to optimize your newly installed executables.


preload: Keep frequently used applications in memory

Preload is a small daemon that monitors the files of frequently used applications and loads them into memory when the system is idle. This usually results in shorter startup times for those applications. Install preload by executing the following command.

sudo apt-get -y install preload

deborphan: Find orphan packages

After several installs and removals, apt leaves behind a lot of packages that are not needed anymore. You can find these packages using deborphan. Install deborphan by issuing the following command.
sudo apt-get -y install deborphan
Now, to see the list of packages which are not needed anymore, just run the following command.
deborphan
To remove these packages, give the following command.
sudo apt-get remove `deborphan`
To get full list of packages that are not essential for functioning of the system execute the following command.
deborphan --guess-all
This command will list data, dev and many other types of packages that are not essential for system functioning. Please see the deborphan man page to find out more options.
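The review-then-remove flow above can be wrapped in a small function. This is a convenience sketch, not from the original snippet; it assumes GNU xargs, whose -r flag skips apt-get entirely when deborphan finds nothing.

```shell
# Sketch: list orphans for review, then purge them in one pass.
purge_orphans() {
    deborphan                       # review the list first
    deborphan | xargs -r sudo apt-get -y remove --purge
}
```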


localepurge: Remove unneeded locales

Remove the locale information (language tables) of applications for languages you don't need. localepurge keeps the locales you specify and removes the rest. To install localepurge, issue the following command.
sudo apt-get -y install localepurge
During installation, localepurge will ask you to select the locales you want to preserve. Select what you need and press OK. From then on, localepurge will keep only the locales you selected during this configuration step.

from ubuntu snippets

Tuesday, July 22, 2008

Dark Knight Shift: Why Batman Could Exist--But Not for Long

Batman is the most down-to-earth of all the superheroes. He has no special powers from being born on a distant world or bitten by a radioactive spider. All that protects him from the Joker and other Gotham City villains are his wits and a physique shaped by years of training—combined with the vast fortune needed to reach his maximum potential and augment himself with Batmobiles, Batcables and other Bat-goodies, of course. In the 2005 blockbuster Batman Begins, vengeful Bruce Wayne (played by Christian Bale) hones his killer instincts in the streets for seven years before landing himself in a Bhutanese prison, where he falls in with the mysterious League of Shadows, who teach him the way of the ninja. The Dark Knight, the next movie in the Batman franchise, opens in theaters Friday. To investigate whether someone like Bruce Wayne could physically transform himself into a one-man wrecking crew, Scientific American turned to E. Paul Zehr, associate professor of kinesiology and neuroscience at the University of Victoria in British Columbia and a 26-year practitioner of Chito-Ryu karate-do. Zehr's book, Becoming Batman: The Possibility of a Superhero (The Johns Hopkins University Press), due out in October, tackles our very question. An edited transcript of the conversation follows.

What have comic books and movies told us about Batman's physical abilities?
There's a quote from Neal Adams, the great Batman illustrator, who said Batman would win, place or show in every event in the Olympics. Probably if I were Batman's handler, I'd put him in the decathlon. Although Batman is shown in the comics as being the fastest and the strongest and all these other things, in reality you can't actually be all of that at once. To be Batman properly, what you really need to do is be exceptionally good at many different things. It's when you take all the pieces and put them together that you get the Batman.

What's most plausible about portrayals of Batman's skills?
You could train somebody to be a tremendous athlete and to have a significant martial arts background, and also to use some of the gear that he has, which requires a lot of physical prowess. Most of what you see there is feasible, to the extent that somebody could be trained to that extreme. We'll be seeing that kind of thing in less than a month at the Olympics.

What's less realistic?
A great example is in the movies where Batman is fighting multiple opponents and all of a sudden he's taking on 10 people. If you just estimate how fast somebody could punch and kick, and how many times you could hit one person in a second, you wind up with numbers like five or six. This doesn't mean you could fight four or five people. But it's also hard for four or five people to simultaneously attack somebody, because they get in each other's way. More realistic is a couple of attackers.

How long would Bruce Wayne have to train to become Batman?
In some of the timelines you see in the comics, the backstory is he goes away for five years—in some it's three to five years, or eight years, or 12 years. In terms of the physical changes (strength and conditioning), that's happening fairly quickly. We're talking three to five years. In terms of the physical skills to be able to defend himself against all these opponents all the time, I would benchmark that at 10 to 12 years. Probably the most reality-based representation of Batman and his training was in Batman Begins.

Why such a long training time?
Batman can't really afford to lose. Losing means death—or at least not being able to be Batman anymore. But another benchmark is having enough skill and experience to defend himself without killing anyone. Because that's part of his credo. It would be much easier to fight somebody if you could incapacitate them with extreme force. Punching somebody in the throat could be a lethal blow. That's pretty easy to do.

But if you're thinking about something that doesn't result in lethal force, that's more tricky. It's really hard for people to get their heads around, I think. To be that good, to not actually lethally injure anyone, requires an extremely high level of skill that would take maybe 15 to 18 years to accumulate.

Where does that number of 15 to 18 years come from?
That comes from my own training in martial arts and seeing how long it takes people to respond to simple situations—let alone the complexities of smoke bombs going off and people having big Batsuits on. No matter how much training you have, when we're subjected to a lot of psychological stress, we make a bunch more mistakes. The police talk about this when they use things called reality-based training. It takes years and years and years and years to have the poise to be able to perform when somebody is attacking you for real.

What's a realistic training regimen?
I didn't give a training manual in my book, but he'd want to do specialized weight training to build up an ability to work at a really high rate for maybe 30 seconds to a minute (the maximum time period associated with his fights). One of the early comics shows him holding an enormous weight over his head. That's not the right kind of adaptation toward punching and kicking. He's got to make sure he's doing all the skill training at the same time so that he's actually using the (physical) adaptations he's slowly gaining. In conventional martial arts, when people take weapons training, you're doing a kind of power-strength training.

What effects would all that training have on Bruce Wayne's body?
I looked up what DC Comics and some other books said (about Batman's physique). I settled on the estimate that Bruce Wayne started off at about six-foot-two and 185 pounds. I gave him a body fat of 20 percent (slightly below average) and a body mass index of 26. Let's say after 10 or 15 years, after he's become the Batman, he's weighing about 210 pounds and has a body fat of 10 percent. He's probably gained 40 pounds of muscle. His bones will actually be more dense, kind of the opposite of osteoporosis.

Are we talking freakishly dense bones?
The percentage change is actually quite small—maybe 10 percent. In judo, where people do a lot of grappling and throwing, you're going to have more density in the long bones of the trunk. In karate and other martial arts where they're doing a lot of kicking, there's going to be a lot higher density in the legs. Muay Thai (kickboxing) is a great example. They're always doing these low shin kicks. They try to condition the body by kicking progressively harder objects and for longer.

What about his reaction speed?
There is evidence that experts in something like football or hockey have an improved ability to perceive movement in time. In the book I use the example of Steve Nash throwing the ball, even though he can't see where the receiver of the pass is going to be. Experts are able to extract more information faster than others. It's almost like their nervous systems become more efficient.

How would Batman get enough rest?
The difficulty for Batman is he's going to be trying to sleep during the day. He's going to be really tired, actually, unless he can shift himself over to just being up at night. If he were just a nocturnal guy, he would actually be a lot healthier and have a lot better sleep than if he were doing what he does now, which is getting some light here and there. That's going to mess up his sleep patterns and duration of sleep.

Wouldn't fighting Gotham's thugs every night take its toll?
The biggest unreal part of the way Batman's portrayed is the nature of his injuries. Most of the time, in the comics and in the movies, even when he wins, he usually winds up taking a pretty good beating. There's a real failure to show the cumulative effect of that. The next day he's shown out there doing the same thing again. He'd likely be quite tired and injured.

Is there any indication in the comics of how long Batman's career lasts?
The comics are really vague on this, of course. In Frank Miller's The Dark Knight Returns, he deliberately shows an aging Batman coming back after he's retired, and he highlights him being tired and weaker. Somewhere around age 50 to 55, he should probably retire. His performance is going down. He's always facing younger adversaries. That is well past the point where he can defend himself and still avoid using lethal force. This was actually shown in an animated series called Batman Beyond.

Oh right. It's the future; Batman is old and he trains a kid to replace him.
You're familiar with that one? What we learn is that Batman, when he was older but before he retired, actually picked up a gun against a thug because he had to. His skills had let him down so that he wasn't able to defend himself without harming another person. So that's when he decided to retire.

How would all those beat-downs have affected his longevity?
Keeping in mind that being Batman means never losing: If you look at consecutive events where professional fighters have to defend their titles—Muhammad Ali, George Foreman, Ultimate Fighters—the longest period you're going to find is about two to three years. That dovetails nicely with the average career for NFL running backs. It's about three years. (That's the statistic I got from the NFL Players Association Web site.) The point is, it's not very long. It's really hard to become Batman in the first place, and it's hard to maintain it when you get there.

There's research suggesting that concussions might cause depression in NFL players. Could that be one reason why the Dark Knight is so brooding?
I went through a lot of comics and graphic novels and I only found a couple of examples where some of those blows to Batman's head had the effect of something like a concussion. Whereas in reality, that would be a very likely outcome. He's able to offset some of the physical damage to his head because of the cowl—it works a bit like a helmet. But these things would definitely add up. Since they don't admit that he has concussions, you can't really ascribe repeated concussions as the reason why he's brooding.

Do you think Batman would take steroids to heal faster?
No. There is one comic where he did go on steroids. He went a little crazy and he went off them again.

How many of us do you think could become a Batman?
If you took the percentage of billionaires and multiplied it by the percentage of people who become Olympic decathletes, you could probably get a close estimate. The really important thing is just how much a human being really can do. There's such a huge range of performance and ability you can tap into.

from scientific american

Friday, July 18, 2008

Top 10 reasons to hate the iPhone 3G

Is the iPhone 3G really deserving of the nickname Jesusphone?

Sure, the iPhone 3G is a groundbreaking phone. There's a lot to love about it… the amazingly easy-to-use touchscreen interface, amazing video playback, a big, bright, high contrast, high-resolution display that's the best of any smartphone on the market, and a web browser that's as good as any you'd use on a desktop computer. Not to mention Apple's new MobileMe service which will provide over the air syncing of your email, contacts, calendar, tasks and photos with your home or office computer — no plugging in required.

But there are a lot of big disappointments with the iPhone 3G too. Some of them are stubborn commercial decisions Apple has made; others look like oversights, and others are fundamental flaws in the design of the phone itself.

Think I don't know jack? Before you post an angry comment, read through the 10 points and then tell me what you think.

#1 No upgrade to the camera

The camera in the first-gen iPhone was only two megapixels with no flash. "Fair enough," I thought… "it's a first-gen product. They have to leave themselves room to move for the upgrade they'll surely put into the next-generation iPhone." No such luck. The camera in the iPhone 3G is exactly the same as the first-gen one. Still stuck at two megapixels. Still unable to cope in low-light and still no flash. Oh, and there's no video recording capability either, even though this has been found on phones for the last five years or so.

iPhone 3G: 2 megapixel camera, no flash, no video, no optical zoom
Other phones: up to 5 megapixel cameras, optical zoom, lens-based autofocus, flash.
Verdict: Smackdown by other phones.

#2 No Adobe Flash support

Undeniably, the iPhone has the best web browser of any phone on the market. But when you hit a web page with Adobe Flash in it, you'll just get an empty space with a 'missing plugin' icon. Apple says Flash would run too slowly on the iPhone, but in reality, it's probably more to do with Apple wanting to promote its competing web app development technology, Sproutcore.

Apple realises the 'mobile web' is at a tipping point… if it can get enough momentum behind developers coding sites specifically for the iPhone, it will help sales of the iPhone along in the long term. (That said, unlike Flash, Sproutcore is an open standard that theoretically works in any web browser that supports Javascript, so it could be widely supported by all handset makers if their phone web browsers got better.)

For a laugh, check out Steve Jobs demonstrating the web browser on the iPhone. When he views The New York Times, up pops the 'missing flash' icon.

iPhone: no Adobe Flash support
Other smartphones: Flash Lite support, or full Flash support on Windows Mobile. (Admittedly, Flash support on other phones isn't great either, but then, they're not running a full computer operating system like the iPhone is, where it would be trivially easy to port Flash across to run on it.)
Verdict: Other phones win by a narrow margin.

#3 No instant messaging

Despite the fact that the iPhone comes with unlimited data plans (in the US at least; Australian plans haven't yet been revealed), Apple has hobbled the iPhone's ability to do instant messaging.

Rather than sending instant messages over the internet to friends, the iPhone sends them by SMS. Since Apple has great instant messaging software for the Mac called iChat, this is undoubtedly a concession to phone companies. SMS is widely considered the most expensive data service in the world: each message is capped at 160 characters, yet phone companies charge around 20c per message. Multiplied out, that equates to roughly 1.3 million dollars per gigabyte of SMSes. (By comparison, Aussie mobile network Three offers 1GB of high speed internet usage for $15.)
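That per-gigabyte figure is easy to sanity-check with shell arithmetic, assuming 160 bytes per message and US$0.20 each (both round figures):

```shell
bytes_per_gb=$(( 1024 * 1024 * 1024 ))
msgs_per_gb=$(( bytes_per_gb / 160 ))        # ~6.7 million messages
cost_dollars=$(( msgs_per_gb * 20 / 100 ))   # at 20c per message
echo "~\$$cost_dollars per gigabyte of SMS"
```

which comes out at roughly $1.34 million per gigabyte.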

Oh yeah, and forget about using the iPhone to chat to someone who's sitting at a computer. Heaven forbid you might want to chat to someone using MSN/Windows Live Chat, Google Talk, AIM, ICQ, Facebook or any of the other popular chat protocols.

Hopefully, this ludicrous situation will be remedied by third-party developers writing internet-based chat clients for the iPhone. However, Apple has said that it will not allow applications to run in the background on the iPhone; instead, developers must run an internet-based service that sends a message to Apple's servers, which in turn alert the iPhone so the user can open the app. Yes, it may save battery life on the iPhone, but no, it's not exactly convenient.

On a Blackberry, the Blackberry Messenger just sits quietly in the background. If your phone is on, so is Blackberry Messenger. It's 100% reliable. It doesn't send messages using a stupid method like SMS. It uses the Blackberry's unlimited internet access. And yes, Blackberries do have good battery life.

iPhone 3G: SMS is the only way to instant message people.
Other smartphones: A large variety of instant messaging software that can send messages using the internet capability of the phone.
Verdict: iPhone is shamed by other phones.

#4 Totally impractical for international travel

The iPhone downloads full emails, attachments and all, when you view them on the iPhone. If someone sends you an email with several megabytes of photos attached, that's how much data has to be downloaded by the iPhone. That's fine if you're in your home country and have an unlimited data plan. But go to another country and see how much it costs you — you can expect to pay up to $20 per megabyte. Your roaming charges will soon be running into hundreds of dollars.
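To put that roaming rate in perspective, here's the same back-of-envelope sum in shell, using the $20/MB worst-case figure above and a purely hypothetical 15MB batch of photo attachments:

```shell
attachment_mb=15     # hypothetical size of one photo-laden email
rate_per_mb=20       # worst-case roaming rate quoted above, $/MB
echo "one email: \$$(( attachment_mb * rate_per_mb ))"
```

One email, three hundred dollars.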

Not to harp on about the Blackberry, but when you roam with one of them, it's quite cheap, because the Blackberry servers downscale images to perfectly fit the size of the Blackberry screen before sending them — a huge saving in data transfer charges, and messages are heavily compressed before transmission, etc. In fact, even heavy Blackberry users may be surprised to learn that they use less than 5MB of data per month.

iPhone 3G: It's the data equivalent of the gas guzzling SUVs that GM suspended production of this week.
Other smartphones: Well, there are certainly other data guzzling phones. But Blackberry is a perfect example of a smartphone that's made for roaming.
Verdict: Blackberry wins

#5 Not compatible with Bluetooth car kits or headphones

Apple has Bluetooth wireless in the iPhone, but it only works with a handful of wireless headsets. Forget talking handsfree on Bluetooth car kits or using the iPhone with stereo Bluetooth headphones. You could expect those sorts of features from the world's leading music player, but not the iP… oh, wait.

Considering Apple wants the world to take the iPhone seriously for its phone capabilities, it's truly incredible that it has hobbled the Bluetooth audio capability so much. Could it be because it wants to make money from car equipment manufacturers who build an iPod dock connector into their car stereos?

Caveat: this comment is based on what we know about pre-release versions of the iPhone 2.0 software. It's possible Apple will have fixed this in the release version of the iPhone 3G. 

iPhone 3G: only works with Apple's mono Bluetooth headset and a handful of other companies' similar units. No support for Bluetooth stereo or in-car Bluetooth handsfree.
Other smartphones: many support stereo Bluetooth for streaming to headphones or a stereo, and most models work with Bluetooth car handsfree units (though there are still compatibility glitches between brands, admittedly.)
Verdict: Other phones win

#6 No cut and paste

This one is truly hard to understand. Apple brings out one of the world's most advanced smartphones in terms of user interface, and somehow forgets to put in cut and paste... probably the only smartphone on the market that doesn't have it. The mind boggles. (Also something that Apple could conceivably fix by the time the iPhone 3G is released… here's hoping.)

iPhone 3G: No cut and paste.
Other smartphones: Well, yeah, duh. They have cut and paste.
Verdict: Decisive victory for other phones.

#7 Non user-replaceable battery

It's a sad fact about rechargeable batteries: the first time you recharge them, their maximum capacity degrades. After a few hundred recharges, their capacity is down to something like half their original capacity. Normally, this is annoying, but manageable — you just swap the battery out for a new one, or get a second battery and swap between the two of them until the first battery is toast.

Not so with the iPhone. Its battery is sealed up tightly inside the nearly-impossible-to-pry-open casing (believe me, I've taken the back off an iPhone and that sucker is not meant to come apart… Apple must be replacing the casing of iPhones it services). You have to send your iPhone away, and Apple will then install a new battery for you (in the US it costs $US85.95) and post the phone back. Oh, and you can pay an extra $US30 for the privilege of renting another phone from them to use in the meantime.

Not only is this massively inconvenient, it's a cunning attempt by Apple to get people to simply buy a new iPhone when the battery finally dies. People will be asking themselves… "do I pay $105.95 to get my old iPhone battery fixed, or do I pay $199.00 to buy the latest and greatest model of iPhone?" I know which one I'd pick, and I bet that's central to Apple's business plan.

iPhone 3G: Battery sealed inside the case. Costs a hundred bucks and considerable inconvenience to get it replaced.
Other smartphones: Well, yeah, duh. You just unplug the battery and put a new one in.
Verdict: Crushing loss to Apple.

#8 No MMS

So you've snapped a nice photo on your iPhone and you want to send it to a friend? You'd better hope they have email on their phone, because that's the only way you're going to be able to send it to them with the iPhone. For some reason, despite its ridiculous decision to force all instant messaging through SMS, Apple has totally left out MMS (picture/video SMSes) from the iPhone.

iPhone 3G: No MMS support. You will send your photos using the Apple-authorised method, by email.
Other smartphones: Well, yeah, duh. They have MMS.
Verdict: Own-goal by Apple.

#9 No turn-by-turn navigation

Despite building a GPS satellite navigation receiver into the iPhone, Apple has stopped short of offering voiced, turn-by-turn navigation on the device. Yes, you can plot directions from your current position to somewhere else, and you can watch yourself as a little dot on the map, but have you ever tried doing that in a car? I have … on my Blackberry. I nearly crashed.

If you're thinking I'm being a bit overly critical (isn't it more a "nice to have" feature than a necessity?), compare Apple to Nokia, which has been offering voiced, 3D, turn-by-turn navigation on its phones for a couple of years now. Having a Nokia N78 saved my bacon recently when I realised I was totally lost and didn't have a street directory with me. I also had a Blackberry with me that has 2D map routing similar to what's on the iPhone, and it sucked, because it was like reading a map constantly while driving.

iPhone 3G: No voiced, 3D turn-by-turn navigation.
Other smartphones: OK, so it's not a standard feature on all phones. But Nokia, which has over 50% market share in Australia, has been shipping it with its phones for the last couple of years.
Verdict: Nokia wins.

#10 Stunning hypocrisy

At Apple's last presentation on the iPhone (March 6th 2008), Apple marketing chief Phil Schiller ridiculed market leader Blackberry for the complexity of its push email service, pointing out that your messages have to pass through a RIM messaging server and a network operations centre before they're sent out to your phone. Plus you have to pay extra for the service.

With the iPhone 3G, Apple introduces MobileMe, a service that … passes your email through an Apple messaging server before it is sent through to your phone. And it costs $AUD119 per year extra. Spot any similarity with the Blackberry business model?

It seems stunningly hypocritical for Apple to criticise the technology of the market leader in the US smartphone space, then adopt the same technology in its own product. On the other hand, I'm glad it has… but I'm flabbergasted at Apple's audacity in building such a service while at the very same moment criticising others for doing the same thing.

iPhone 3G: made by a company dominated by self-serving hypocrites.
Other smartphones: let's be honest... made by companies dominated by self-serving hypocrites.
Verdict: Apple is on even footing with other handset makers. Welcome to the industry!

from apcmag

5 reasons to avoid iPhone 3G

The 5 real reasons to avoid iPhone 3G:

  • iPhone completely blocks free software. Developers must pay a tax to Apple, who becomes the sole authority over what can and can't be on everyone's phones.
  • iPhone endorses and supports Digital Restrictions Management (DRM) technology.
  • iPhone exposes your whereabouts and provides ways for others to track you without your knowledge.
  • iPhone won't play patent- and DRM-free formats like Ogg Vorbis and Theora.
  • iPhone is not the only option. There are better alternatives on the horizon that respect your freedom, don't spy on you, play free media formats, and let you use free software -- like the FreeRunner.

"This is the phone that has changed phones forever," Mr. Jobs said.

We agree. A snake oil salesman not satisfied with his business of pushing proprietary software and Digital Restrictions Management (DRM) technology into your home, Jobs has set his sights on getting DRM and proprietary software into your pocket as well.

There is a reason so much emphasis was put on the visual design of the iPhone. There is a reason that Apple is so concerned about unsightly seams that they won't even let you change the battery in your own phone.

Apple, through its marketing and visual design techniques, is manufacturing an illusion that merely buying an Apple makes you part of an alternative community. But the technology they use is explicitly chosen to divide people into separate digital cells, and to position Apple as sole warden. When your business depends on people paying for the privilege of being locked up, the prison better look and feel luxurious, and the bars better not be too visible.

Wait, locked up? Prison? It's a phone. Aren't we being a little extreme?

Unfortunately, we are not. The extreme here is represented by Jobs and Apple. The iPhone is an attack on very old and fundamental values -- the value of people having control over their stuff rather than their stuff having control over them, the right to freely communicate and share with others, and the importance of privacy.

The iPhone does make phone calls, but it is not just a phone. It is a general-purpose computer, more powerful in terms of hardware than the ones we might have had sitting on our desks just a few years ago. It's also a tracking device, and like other proprietary GPS-enabled phones, can transmit your location without your knowledge.

As of November 2007, 3.3 billion people in the world had mobile telephones, and the number continues to rise rapidly. For many of these people, phones are becoming the most important computers they own. They are vital to their communications and they are with them all the time. Of all the technology people use that could be turned against them, this is one of the most frightening possibilities.

But there is an important difference between the iPhone and prior general-purpose computers: The iPhone is broken, on purpose. It is in theory capable of running many different kinds of programs, but software applications and media will be limited via Apple's ironically named Digital Restrictions Management technology -- "FairPlay".


Apple's DRM system monitors your activities and tells you what you are and are not allowed to do. What you are not allowed to do is install any software that Apple doesn't like. This restriction prevents you from installing free software -- software whose authors want you to freely share, copy and modify their work.

Free software has given us many exciting things on the desktop -- the GNU/Linux operating system, the Firefox web browser, the suite, the Apache webserver that runs most of the web sites on the internet. Why would we want to buy a computer that goes out of its way to obstruct the freedom of such creators?

This system is not Apple's only FoulPlay. iPhones can now also only be activated in stores -- despite the fact that in the U.S., the Register of Copyrights ruled that consumers have the right to unlock their phones and switch to a different carrier.

Fingerpointing (and we don't mean the touch screen)

Jobs would have us believe that all of these restrictions are necessary. He nods and agrees when we complain about them, and says that he doesn't like them either. He claims that Apple is forced to include them for our own good -- for the safety of the whole telephone network, and to allow access to all the movies and music we want.

But it's been a year and a half since Jobs, under pressure from the public, spoke out strongly against DRM and in favor of freedom. With great hesitation, he allowed a handful of files to go DRM-free on iTunes, but kept in place the requirement that they be purchased using the proprietary, DRM-infected iTunes software. Since then, he has done absolutely nothing to act on those words. In his movie and video ventures, he has continued to push DRM. And now he's bringing it to mobile software applications as well. It's become clear that those words were a ploy to defuse opposition.

The truth is that there are thousands of software, music, and media creators who want to share their work more freely. It's funny -- as in reprehensible -- because Apple's OS X operating system was in fact largely built on software written by people who voluntarily made their work free to others for further copying, modification and improvement. When people have the freedom to tinker, create, and innovate, they make exciting and useful creations. People have already been writing their own free software to run mobile platforms. The telephone network is still standing.

We know Jobs is afraid of competition, and is manufacturing threats and excuses. This is simply a business decision, and it's a kind of business we shouldn't support. Jobs wants the iPhone to restrict you because he wants your money and increased control is a means to that -- he wants to take as much from you as possible, give you back as little as possible, and keep his costs at the absolute minimum. He's trying to make sure that nobody writes software for the iPhone to do things that he doesn't want the iPhone to be able to do -- such software might make FoulPlay less foul, play alternative media formats, show the user exactly what's being communicated from the phone to the people monitoring it, or even disable transmission of that information.

Being the future we want to see

Fortunately, we will soon be able to have all the convenience of a mobile computer that also makes phone calls without selling our freedom to Apple, Microsoft, BlackBerry, or anyone else. The Neo FreeRunner is a promising free-software phone, being developed in cooperation with the same worldwide community responsible for the GNU/Linux operating system. These are creators who want to share their work and who want you and others to be able to do what they did -- build on the work of people who came before them to make new, empowering devices.

Jobs built on the work of people before him too, only his answer is to kick away the ladder and try to prevent anyone else from doing what he did. His customers are fighting back -- according to Apple in October 2007, over 250,000 of the 1.4 million iPhones sold were unlocked by their users. Rather than embracing this, Jobs thinks it should be stopped.

We have a choice. The FreeRunner doesn't yet do as much as the iPhone and it's certainly not as pretty. But in terms of potential, the fact that it's supported by a worldwide community of people rather than a single greedy, dishonest and secretive entity puts it light-years ahead. We can trade our freedom and our money to get something flashy on the surface, or we can spend a little more money, keep our freedom, and support a better kind of business. If we want businesses to be ethical, we have to reward the ones that are. By not enriching companies that want to take away our freedom and by rewarding those that respect us, we will be helping to bring about a better future.


Tuesday, July 15, 2008

Useful Shortcut Keys In Ubuntu

General keyboard shortcuts

Ctrl + A = Select all
Ctrl + C = Copy the highlighted content to clipboard
Ctrl + V = Paste the clipboard content
Ctrl + N = New (Create a new document, not in terminal)
Ctrl + O = Open a document
Ctrl + S = Save the current document
Ctrl + P = Print the current document
Ctrl + W = Close the current document
Ctrl + Q = Quit the current application

Keyboard shortcuts for GNOME desktop

Ctrl + Alt + F1 = Switch to the first virtual terminal
Ctrl + Alt + F2(F3)(F4)(F5)(F6) = Select the different virtual terminals
Ctrl + Alt + F7 = Restore back to the current terminal session with X
Ctrl + Alt + Backspace = Restart GNOME
Alt + Tab = Switch between open programs
Ctrl + Alt + L = Lock the screen.
Alt + F1 = opens the Applications menu
Alt + F2 = opens the Run Application dialog box.
Alt + F3 = opens the Deskbar Applet
Alt + F4 = closes the current window.
Alt + F5 = unmaximizes the current window.
Alt + F7 = move the current window
Alt + F8 = resizes the current window.
Alt + F9 = minimizes the current window.
Alt + F10 =  maximizes the current window.
Alt + Space = opens the window menu.
Ctrl + Alt + + = Switch to next X resolution
Ctrl + Alt + - = Switch to previous X resolution
Ctrl + Alt + Left/Right = move to the next/previous workspace

Keyboard shortcuts for Terminal

Ctrl + A = Move cursor to beginning of line
Ctrl + E = Move cursor to end of line
Ctrl + C = kills the current process.
Ctrl + Z = sends the current process to the background.
Ctrl + D = logs you out.
Ctrl + R = finds the last command matching the entered letters.
Enter a letter, followed by Tab + Tab = lists the available commands beginning with those letters.
Ctrl + U = deletes from the cursor to the beginning of the line.
Ctrl + K = deletes from the cursor to the end of the line.
Ctrl + W = deletes the word before the cursor.
Ctrl + L = clears the terminal output
Shift + Ctrl + C = copy the highlighted command to the clipboard.
Shift + Ctrl + V (or Shift + Insert) = pastes the contents of the clipboard.
Alt + F = moves forward one word.
Alt + B = moves backward one word.
Arrow Up/Down = browse command history
Shift + PageUp / PageDown = Scroll terminal output

Keyboard shortcuts for Compiz

Alt + Tab = switch between open windows
Win + Tab = switch between open windows with Shift Switcher or Ring Switcher effect
Win + E = Expo, show all workspace
Ctrl + Alt + Down = Film Effect
Ctrl + Alt + Left mouse button = Rotate Desktop Cube
Alt + Shift + Up = Scale Windows
Ctrl + Alt + D = Show Desktop
Win + Left mouse button = take screenshot on selected area
Win + Mousewheel = Zoom In/Out
Alt + Mousewheel = Transparent Window
Alt + F8 = Resize Window
Alt + F7 = Move Window
Win + P = Add Helper
F9 = show widget layer
Shift + F9 = show water effects
Win + Shift + Left mouse button = Fire Effects
Win + Shift + C = Clear Fire Effects
Win + Left mouse button = Annotate: Draw
Win + 1 = Start annotation
Win + 3 = End annotation
Win + S = selects windows for grouping
Win + T = Group Windows together
Win + U = Ungroup Windows
Win + Left/Right = Flip Windows

Keyboard shortcut for Nautilus

Shift + Ctrl + N = Create New Folder
Ctrl + T = Delete selected file(s) to trash
Alt + ENTER = Show File/Folder Properties
Ctrl + 1 = Toggle View As Icons
Ctrl + 2 = Toggle View As List
Shift + Right = Open Directory (Only in List View)
Shift + Left = Close Directory (Only in List View)
Ctrl + S = Select Pattern
F2 = Rename File
Ctrl + A = Select all files and folders
Ctrl + W = Close Window
Ctrl + Shift + W = Close All Nautilus Windows
Ctrl + R = Reload Nautilus Window
Alt + Up = Open parent directory
Alt + Left = Back
Alt + Right = Forward
Alt + Home = go to Home folder
Ctrl + L = go to location bar
F9 = Show sidepane
Ctrl + H = Show Hidden Files
Ctrl + + = Zoom In
Ctrl + - = Zoom Out
Ctrl + 0 = Normal Size

(For those who want to configure your own keyboard shortcuts, you can do it at System->Preferences->Keyboard Shortcuts.)

Monday, July 14, 2008

DreamWorks Animation's Kung Fu Panda created with Linux-based HP workstations

Ed Leonard, CTO at the digital facility, discusses some of the technology used on the film

By John Virata

DreamWorks Animation's Kung Fu Panda was created entirely on Linux-based Hewlett Packard workstations. The digital studio has been using HP workstations successfully on its films since 2001. Ed Leonard, CTO at DreamWorks Animation, discusses the company's digital workflow, the history of the HP-DreamWorks alliance, and the benefits of multi-core processing systems.

DMN: Last time DMN spoke with DreamWorks, both DreamWorks and HP had just announced an alliance bringing HP's Intel-based Linux workstations into DreamWorks to help in the production of Spirit: Stallion of the Cimarron. Has HP been providing DreamWorks with workstations since the alliance was announced in 2002?
Ed Leonard: Yes, the partnership started in summer 2001 when the first Shrek movie came out.  HP is our preferred technology provider. We have used HP Workstations on every movie since 2001.

DMN: How is the HP-DreamWorks alliance working out?
EL: Great, we love it. We have a great partnership with HP. HP is a very deep and broad company. The engineering depth allows us to find innovative products that meet our needs and helps us lead the industry in technology.

DMN: Has the relationship developed any new categories?
EL: Yes, it started with the transition to Linux, which allowed us to lead the industry to commodity-based platforms; later we developed the Halo virtual collaboration, and now the DreamColor display that will launch in June. It has been a fruitful collaboration.

Kung Fu Panda was created entirely on HP workstations

DMN: Does DreamWorks still serve as a test site for new graphics workstation technology coming out of HP Labs?
EL: Yes, we have a very close relationship with the HP Workstation group and most groups in HP to get the most advanced products from research and development to production. It gives us a chance to see where platforms are going, and gives us a competitive advantage by giving us early access to the technology before others get to it.

DMN: Does Linux still serve as the animation platform of choice at DreamWorks or is there a mix of Linux, Windows, and other platforms?
EL: Yes for us, Linux is definitely the right platform. Our entire production process is built on Linux, from the workstations to the renderfarms.

DreamWorks Animation has been using HP workstations for the last seven years

DMN: Why does DreamWorks use AMD-based HP workstations?
EL: Historically, we have used AMD. We are now starting to use a mixed platform of both AMD and Intel-based systems.
We love that HP manages the complete platform behind the system. HP delivers best performance for both platforms, Intel and AMD.

DMN: DreamWorks has renderfarms in both its Glendale facility as well as in Redwood City. How are the systems set up so users can collaborate over such long distances?
EL: Something unique about our company is that we have a completely virtualized infrastructure and are able to access all capabilities from all of our campuses. So it looks like one renderfarm, but it is physically located in two places. We then combine the virtual infrastructure with Halo rooms to allow us to seamlessly act as one. We access the renderfarms from the Halo studio, and it is as if we are all working from the same location.
We actually had a consultant working for us in Southern California who was engaged to be married and hadn't finished their pre-marital counseling in Northern California. Rather than flying home for their last counseling session, they did their last session on Halo. We joked, calling it "Holy Halo."

Kung Fu Panda

DMN: How many systems and digital artists were employed in the creation of KFP?
EL: There are always many digital artists and systems used on our films. Just on KFP there were 400 HP Workstations, 1500 servers, and over 6000 cores. Render hours: 24 million, that is four times as many hours as the original Shrek movie.

DMN: Were there any technology breakthroughs in KFP?
EL: We are changing a little bit, but we can do most things. We are removing a lot more limits in the filmmaking process. KFP combines many of the difficult things you can do in computer graphics: you have action, character contact, and furry animals wearing clothing while they are doing Kung Fu fighting. KFP has a very traditional Kung Fu movie look to it in an artistic animation genre.
Making a film is always about compromise; characters can only touch so much. We've actually had hug limits in some of our past films because making characters touch was so difficult. We are now making our systems less compromising for the filmmakers, which allows them to carry out their vision.
It's all about interactive performance: the evil side is complexity; more visual, geometric, and surface complexity. You want more believable characters to suspend disbelief when you watch our movies. The faster the workstations, the more visual richness you can build into characters.

DMN: Is ToonShooter still widely used at DW?
EL: No, but we've brought back ToonShooter for one scene in KFP, because there is one scene that flashes back to traditional animation. ToonShooter is a 2D pencil-test system; DreamWorks is moving more and more toward 3D animation, so it is not widely used anymore. In fact, we had to dust it off a bit before we used it.

DMN: Were the animation tools used in the production of KFP proprietary tools or off the shelf solutions?
EL: We develop most of our tool suite ourselves. We have our own proprietary render and lighting tools.

DMN: KFP is the first DW film to use dual-core processing workstations from the beginning. What did this allow you to do differently?
EL: It's about visual richness and computational power again. Multi-core allows a more interactive experience for our artists. On this film we implemented our rendering to be multi-core. We are able to render 25 times faster because we are using multi-core systems. So something that took multiple minutes now takes seconds. Then if you combine four machines you have 16 cores and can scale the time down even more. We've also optimized our own tools and software to take advantage of the multi-core systems. For example, we've optimized our interactive lighting tool and enhanced it for multi-core computing, which allows for almost real-time, live-action directing.

from broadcastnewsroom

Thursday, July 10, 2008

Using Spring 2.x to Wire Model 1 Servlets

At work I am currently maintaining an older, Model 1 servlet application. This application is working great in production, even though I think the overall back-end design is about as pretty as the Elephant Man! I have always longed for better ways to add functionality to the code, and for using Spring in parts of the application.

Well, recently I needed to migrate some Spring code from another application into this old one. My first thought was: oh crap, this is going to be interesting. Well, things were interesting indeed, and in a good way. I found that there is a nice, clean way to integrate the Spring Framework with my current HTTP servlets.

First I identified that I would like to have Spring wire up some database calls in my LoginServlet. To do this I had to make a change to the web.xml to use the ContextLoaderListener class and load the applicationContext.xml file.
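For reference, the listener setup in web.xml looks roughly like this. This is a sketch, not the exact config from the original post; in particular, the /WEB-INF/applicationContext.xml path is an assumption about where the file lives:

```xml
<!-- Tell Spring where to find the bean definitions (path is an assumption) -->
<context-param>
    <param-name>contextConfigLocation</param-name>
    <param-value>/WEB-INF/applicationContext.xml</param-value>
</context-param>

<!-- Bootstraps the Spring root application context when the webapp starts -->
<listener>
    <listener-class>org.springframework.web.context.ContextLoaderListener</listener-class>
</listener>
```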


Now that Spring is set up, I have identified that the LoginServlet is going to be wired using some Spring code. To do this, I change my web.xml file to use Spring's HttpRequestHandlerServlet as the LoginServlet's servlet-class definition.

    <servlet>
        <servlet-name>LoginServlet</servlet-name>
        <servlet-class>org.springframework.web.context.support.HttpRequestHandlerServlet</servlet-class>
    </servlet>

Next I need to reference the LoginServlet class that is in my application and have it wired with Spring. To do this, I created a Spring bean in the applicationContext.xml with the same id as the servlet-name parameter in the web.xml.

<bean id="LoginServlet" class="">
   <property name="hrEmpApptDataDAO" ref="hrEmpApptDataDAO"/>
   <property name="oneCardAccountDAO" ref="oneCardAccountDAO"/>
   <property name="oneCardSvcDAO" ref="oneCardSvcDAO"/>
</bean>

The last piece is pretty easy: just go to the LoginServlet class and implement HttpRequestHandler, like so.

public class LoginServlet implements HttpRequestHandler { 

This will require you to add the handleRequest method to the servlet, which is what is called when you access the servlet from your application. The doGet, doPost, and other default methods are handled for you by the HttpRequestHandlerServlet wrapper. That was it. Now I have this old servlet, which I can wire up and pass Spring JdbcTemplate objects to.

Top Ten Performance Problems and Their Solutions

Whether you're the developer or the user of a Java application that you would like to see run faster, here are the top ten tips.

10) GregorianCalendar

This class is slow and not thread-safe. If possible, avoid it and use Joda Time instead.

9) No time based animation

All animation should be time-based, so that if it's meant to take 1 second it happens in 1 second, even if it only renders 4 frames a second.
I hate the "all downloads have been finished" window that takes 10 seconds and uses all the CPU available. Look at the Timing Framework.
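The idea can be sketched in a few lines (a hypothetical class, not from the original tip): progress is derived from elapsed wall-clock time, so the animation finishes on schedule no matter how many frames get drawn.

```java
// Time-based animation: progress depends on elapsed time, not frame count.
public class Animation {
    private final long durationMs;

    public Animation(long durationMs) {
        this.durationMs = durationMs;
    }

    // Returns progress in [0, 1] for a given elapsed time in milliseconds.
    public double progress(long elapsedMs) {
        if (elapsedMs <= 0) return 0.0;
        if (elapsedMs >= durationMs) return 1.0;
        return (double) elapsedMs / durationMs;
    }

    public static void main(String[] args) {
        Animation fade = new Animation(1000); // a 1-second animation
        // Whether we render 4 frames or 60, it still ends at 1000 ms.
        System.out.println(fade.progress(250));  // 0.25
        System.out.println(fade.progress(1500)); // 1.0
    }
}
```

A real renderer would call progress() once per frame with System.currentTimeMillis() minus the start time, instead of stepping a fixed amount per frame.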

8) No cache

If you compute the same thing again and again, identify where it happens and cache the latest results.
Be careful not to create a memory problem by never releasing objects from the cache.
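One way to get caching without the unbounded-memory problem is a small LRU cache. This sketch (a hypothetical helper, not from the original tip) uses the JDK's LinkedHashMap eviction hook:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Caches up to maxEntries results and silently evicts the least recently
// used entry beyond that, so the cache can never grow without bound.
public class BoundedCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    public BoundedCache(int maxEntries) {
        super(16, 0.75f, true); // accessOrder=true gives LRU iteration order
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxEntries;
    }

    public static void main(String[] args) {
        BoundedCache<String, Integer> cache = new BoundedCache<>(2);
        cache.put("a", 1);
        cache.put("b", 2);
        cache.put("c", 3); // evicts "a", the least recently used entry
        System.out.println(cache.keySet()); // [b, c]
    }
}
```

For concurrent code you would want a thread-safe variant, but the bounded-eviction idea is the part that prevents the memory leak.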

7) No feedback

If you don't provide a progress bar or an hourglass, the user will think the application is hanging or slow. Provide some distraction: the user won't notice the wait, or at least will know what the application is doing. Use JProgressBar or setCursor(new Cursor(Cursor.WAIT_CURSOR));

6) Logging

Logging is slow, as it uses the disk and is synchronized (each thread needs to wait its turn).
The first fix is to log to a separate disk from the one the application runs on.
You should also look at what you're logging and reduce it to only what is needed.
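A related, cheap way to cut logging cost (not mentioned in the tip, but in the same spirit) is to avoid building messages that will never be written. This sketch with hypothetical names uses java.util.logging's isLoggable guard:

```java
import java.util.Arrays;
import java.util.logging.Level;
import java.util.logging.Logger;

public class GuardedLogging {
    private static final Logger LOG = Logger.getLogger(GuardedLogging.class.getName());

    // Stand-in for a log message that is expensive to build.
    static String describe(int[] data) {
        return "processing " + Arrays.toString(data);
    }

    public static void main(String[] args) {
        int[] data = {1, 2, 3};
        // The message string is only built when FINE logging is actually
        // enabled, so disabled log statements cost almost nothing.
        if (LOG.isLoggable(Level.FINE)) {
            LOG.fine(describe(data));
        }
    }
}
```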

5) Premature optimization

What? Yes, premature optimization is the root of all evil.
Moreover, you may make things worse with incorrect assumptions.
I once profiled a framework where 40% of the time was spent on premature optimization.
Don't assume; measure and act. Measure and optimize during the late beta phase of your project.
Profilers are your friends here.

4) Database

Databases are often the weakest link of an application.
Measure, create indexes where needed, optimize queries, and if it is still a problem, change databases.

3) Network

The network is slow. Two things in particular are slow: the connection to the server and the download of the information.
Cache everything that comes from the network when possible. Compress the information sent and received using GZIPOutputStream or a GZip servlet filter.
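A self-contained sketch of the compression idea, using the JDK's GZIPOutputStream and GZIPInputStream (the class and method names here are mine, not from the original tip):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipRoundTrip {
    // Compresses a byte array in memory with gzip.
    static byte[] compress(byte[] input) {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            GZIPOutputStream gz = new GZIPOutputStream(bos);
            gz.write(input);
            gz.close(); // finishes the gzip stream and flushes everything
            return bos.toByteArray();
        } catch (IOException e) {
            throw new RuntimeException(e); // in-memory streams shouldn't fail
        }
    }

    // Reverses compress(), recovering the original bytes.
    static byte[] decompress(byte[] input) {
        try {
            GZIPInputStream gz = new GZIPInputStream(new ByteArrayInputStream(input));
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            byte[] buf = new byte[4096];
            int n;
            while ((n = gz.read(buf)) > 0) {
                bos.write(buf, 0, n);
            }
            return bos.toByteArray();
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 100; i++) sb.append("some repetitive payload ");
        byte[] original = sb.toString().getBytes();
        byte[] packed = compress(original);
        // Repetitive text compresses well; the packed form is much smaller.
        System.out.println(original.length + " bytes -> " + packed.length + " bytes");
    }
}
```

Over HTTP you would normally let a servlet filter or the container apply the same gzip encoding transparently rather than calling these by hand.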

2) Software

Use the latest software available, especially Java 6 update 6.
Application servers and databases also compete with each other to be the fastest, so the more recent the version you have, the faster it will be.
Another good example of a performance improvement is FreeBSD 7.

1) Hardware

Who doesn't know Moore's law?
I was once asked to produce a benchmark for a toolkit. After writing and running the code, I presented my results with the conclusion: for better results, run it on better hardware. And they did.

First Indicators of an Over-Engineered Project

The problem with patterns, best practices, and idioms is the overuse of a single principle. Regardless of which one you consider: overuse of DRY can lead to "fat" layers and classes, overuse of Separation of Concerns to too many fine-grained units, and overuse of modularization to JAR, plugin, or just plain governance hell.

  1. You start to use terms like "potentially", "in future", or "scalable".
  2. You spend more time thinking of "encapsulation", "abstraction", and "decoupling", than the actual problem.
  3. You believe that with the amount of frameworks, libraries, and languages (better polyglot projects), the quality of the software will improve.
  4. You are able to replace every single concept, class and layer—but this feature actually cannot be derived from the client's requirements.
  5. Just looking at the code—you do not understand what happens—you need additional tools, products, and consultants :-) to understand it.
  6. You hate monolithic structures—so everything is configurable, replaceable—of course, at runtime. If it becomes too complex, go to point 5.
  7. You start to implement a generator to tackle the complexity.
  8. Your configuration file is getting slightly bigger than your code.
  9. Your interface is so fluent that only domain experts understand the code. :-)

Common sense and a balance between concepts and idioms are the solution—but that's hard to find in the real world. :-)

Faster Eclipse On Slower Machine

Disable mark occurrences

Open the preferences window (whether you use project-wide or workspace-wide preferences is up to you) and type "mark occurrences" in the filter text box. Select "Mark Occurrences" from the list box and remove the tick from the "Mark occurrences of the selected element in the current file" check box.

Remove structured text validation

This time, type "validation" into the filter text box in the preferences window and select "Validation" from the list box. You'll see a number of file types that are set to be validated. Deselect the validation ticks for all of the file types in the list (you can re-enable them manually later if you want to). You'll see a really big difference in Eclipse's performance if you have big XML and WSDL files. For example, in my last J2EE project my web.xml files contained 1400 lines and my WSDL files contained thousands of lines of text, so Eclipse couldn't handle all the validations on a machine with average memory.

Do not use subclipse plug-in

Subclipse consumes a lot of system resources and hurts Eclipse's performance considerably in big projects. If you can, consider not using Subclipse, especially in projects that keep thousands of files in a Subversion repository. It has really become a heavyweight plug-in with heavyweight code. You may feel better using Subversion from the command line or from a separate client.

Consider converting your static code to a jar library

This advice mostly applies when you have static code automatically generated from a static WSDL belonging to a web service. This way you reduce the raw code size in the project and use the functionality from compiled classes, forcing Eclipse to use fewer system resources.

Configure java virtual machine memory management start up arguments

In your eclipse.ini file, set the -Xms40m and -Xmx256m arguments to fit your needs. These options define the minimum and maximum heap bounds passed to the Java virtual machine for Eclipse's memory allocation. You can tweak these values and experiment to find your optimum Eclipse speed. Also, if you have problems with Eclipse's memory management in Linux environments, like frequent out-of-memory errors, you should define the PermGen space arguments in eclipse.ini. With those arguments set as needed, you will see very few (or no) sudden memory exceptions. Try the values -XX:PermSize=128m -XX:MaxPermSize=128m if you have about 1GB of RAM in your machine.
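Put together, the memory-related part of eclipse.ini might look like this (the values are starting points for a machine with roughly 1GB of RAM; tune them for yours, and note that JVM options must come after the -vmargs line):

```ini
-vmargs
-Xms40m
-Xmx256m
-XX:PermSize=128m
-XX:MaxPermSize=128m
```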

Wednesday, July 09, 2008

Speed up ubuntu boot speed

1. Delete residual config packages

Go to System->Administration->Synaptic Package Manager and click the Status button to check whether your system has residual config packages. If it does, mark the residual config packages for complete removal one by one, and click the Apply button to delete them.


2. Delete incomplete packages

Open a terminal and run:

sudo apt-get autoclean

This command clears out cached package files that can no longer be downloaded.

3. Delete orphaned libraries

sudo apt-get install deborphan

sudo deborphan

This checks whether there are orphaned libraries.

If there are, run the following commands to delete them:

sudo deborphan | xargs sudo apt-get -y remove --purge

sudo apt-get autoremove

4. Disable unnecessary services

Go to System->Administration->Services to shut down unnecessary services. I disabled two: Bluetooth and Braille display management.


Gnome and Nautilus Keyboard Shortcut Keys

Gnome Desktop Keyboard Shortcuts

Alt-F2 : Run Application Dialog

Alt-F1 : Open Applications Menu

Alt-F9 : Minimize Active Window

Alt-Tab : Rotate Current Window Focus

Ctrl+Alt+Left Arrow : Move Virtual Desktop Left

Ctrl+Alt+Right Arrow : Move Virtual Desktop Right

Ctrl+Alt+Shift+Left Arrow : Move Current Application Left

Ctrl+Alt+Shift+Right Arrow : Move Current Application Right

Ctrl+Alt+L : Lock Screen

Ctrl+Alt+Del : Log Out

Ctrl+Alt+Backspace : Restart Gnome (careful)

Nautilus File Management

Shift+Ctrl+N : Create New Folder

Ctrl+T : Delete (to Trash)

Alt+ENTER : File/Folder Properties

Ctrl+1 : Toggle View As Icons

Ctrl+2 : Toggle View As List

Shift+Right Arrow : Open Directory (List View)

Shift+Left Arrow : Close Directory (List View)

Ctrl+S : Select Pattern [enter pattern]

F2 : Rename File

Ctrl+A : Select All

Nautilus Navigation Shortcuts

Ctrl+W : Close Window

Ctrl+Shift+W : Close All Nautilus Windows

Ctrl+R : Reload Nautilus Window

Alt+Up Arrow : Open Parent

Alt+Left Arrow : Back

Alt+Right Arrow : Forward

Alt+Home : Home Folder

Ctrl+L : Location Bar

F9 : Toggle Sidepane

Ctrl+H : Show Hidden Files

Ctrl++ : Zoom In

Ctrl+- : Zoom Out

Ctrl+0 : Normal Size

A case for text-based DVD rippers

At a time when graphical DVD rippers and encoders propose to make the backing up of your movies just a click away, a text-based application may actually be the best tool for the job.

I've tried graphical rippers such as AcidRip, dvd::rip, thoggen, and RippedWire, with varying degrees of success. However, I've also had my share of headaches. Most times, any crashes or problems I experienced were related to the graphical components of the application or the desktop. It's not pleasant (to say the least) to leave your computer eating electricity all night, only to find in the morning that the ripping and encoding of a DVD failed because of an error purely related to GTK+ or Qt. It's even more frustrating when you realize that most of these applications are front ends to command-line programs. For instance, AcidRip is a wrapper for MEncoder (not that MEncoder doesn't need a front end), and RippedWire sits on top of HandBrakeCLI. Once you click the Go button, the application basically becomes a giant progress bar, hogging your desktop and system resources.


After having trouble running a GUI-based ripper/encoder on my fresh install of Arch Linux, I stumbled upon undvd. Its only core dependencies are lsdvd and MEncoder, so I decided to give version 0.3.1 a try.

Being a command-line application, undvd installed in the blink of an eye. Running undvd with a DVD loaded in the drive presents a simple screen that shows you the titles available on the disc, along with basic instructions on how to watch them using MPlayer, or rip them. After that, undvd will exit. After deciding upon the title to rip, run undvd again, specifying a few simple parameters. For instance, to rip and encode the first title of a DVD, together with an English audio and subtitle track, you can run undvd -t 01 -a eng -s eng.

A more complex example rips the second track of a DVD (with a French audio track, without subtitles), forcing one-pass encoding. It assumes the DVD is encrypted (and requires libdvdcss to read) and rips straight from the optical disc, using a picture-smoothing filter and Xvid compression: undvd -t 02 -a fr -s off -1 -u -n -f -x

The result (in both cases) is an AVI file of surprisingly good quality. Other options include forcing two-pass encoding, selecting a target size for the AVI file, and video scaling.

undvd rips and encodes the titles to the folder from where you called the application, so beware of disk space issues. The default is for undvd to dump the entire DVD to disk, after which you can remove it from the drive. undvd then rips and encodes from the ISO image on disk. This protects the DVD from overuse and eliminates failed operations due to read errors. However, you can also rip directly from the DVD or from a folder.

I'm usually a nitpicker when it comes to lack of options, but my first encounter with undvd managed to make me a believer. The interface is simple but effective, and pressing the Enter key a couple of times -- or writing a short string of arguments -- is more practical in my book than navigating through tabs and checkboxes.

h264enc and xvidenc

Shortly after discovering undvd, I came upon h264enc and its siblings, xvidenc and divxenc. The author describes h264enc as an interactive script and defends the advantages of a bash script over a GUI application in his informative FAQ. h264enc does basically the same job as undvd (it also uses MEncoder), but it allows for more fine-tuning of the encoding options. The beauty of h264enc is that you can make the ripping/encoding process as simple or as complicated as you want. You can pass through as few menus as possible, or you can take your time tweaking the options. Forty-one quality presets, including presets for portable devices like Apple's iPod, make your life easier.

The man page shows you everything you need to get started. Running h264enc -scan scans the DVD for information on chapters, audio, and more, and presents it to you. After that, you can run h264enc with the necessary parameters to your liking. For instance, to rip the second track of your DVD again, type h264enc -2p -p hq. This makes a two-pass encoding with the High Quality preset. h264enc then shows a series of interactive menus where you can choose your DVD drive, the video track to rip, the audio track, chapters, angles, and so on. You can also choose postprocessing filters (such as deinterlacing, noise removal, and image sharpening) and define the video bitrate, target size, audio codec, and more. If you're in a hurry or confused, simply choose the defaults. h264enc encodes the files to the H.264/MPEG-4 Part 10 standard, using the AVI container by default. You can choose other containers, such as Matroska Multimedia Container (MKV), Ogg Media (OGM), and MPEG-4 Part 14 (MP4). You can also store your settings for a particular job in a file, thus creating your own presets. Storing different settings for different jobs allows you to create batch jobs, which is a great feature to have at hand.

The quality of the final product varies according to the quality of the original video and the settings you choose. After some experimentation and following the advice of the FAQ, I achieved excellent results. Using three-pass encoding with a Very High Quality preset and a target size of 1400MB, for example, gave me a video file practically indistinguishable from the original DVD in terms of picture quality.


I use undvd for most tasks, and h264enc when I need more control over the output (for extreme high-quality rips, or movies with low image quality). Still in the realm of text-based applications, you also might want to consider RipDVD and HandBrakeCLI as sound alternatives. HandBrakeCLI in particular supports multiple CPU cores -- something most other rippers and encoders don't do. Although HandBrake isn't as easy to use as h264enc or undvd, a recent article may help with that. dvd::rip also supports multiple CPU cores, as well as the ability to set up a cluster to increase processing power.

These applications show that a command-line based workflow, in the context of common desktop tasks, still has its place, especially when you want power and simplicity.


Don't compare GNU/Linux with Windows or MacOS - they are not in the same game

Recently a blog post entitled "Why Desktop Linux is its own worst enemy" has come across my feed-radar a few times. It's yet another in the long line of "Linux ain't ready yet" jeremiads, and it doesn't really say anything new, yet it got on my nerves. Why?

Unsupported statements

Like many such pieces, this one starts by making a statement as if it were fact while presenting no actual evidence. The "fact" used here goes along the lines of "Microsoft has shot itself in the foot with Vista and the only ones benefiting are Apple". Even if there were statistical evidence, the premise is mistaken. As I have said before, the "success" of GNU/Linux cannot be measured in the same way as a proprietary OS's. Apples and air, people: you're comparing apples and air. I mean, how can you tell how many Ubuntu installs came from a single CD?

Speaking of unsupported, the post then reels out that hardy perennial: "not working out of the box". Apparently "even getting MP3s to play while using Linux can be problematic". Which distribution was that, then? If playing MP3s out of the box is your thing, try one of the distros that ships with that feature: Linux Mint, PCLinuxOS, etc. Okay, perhaps I've made that too difficult for people there; I mean, perhaps they want "Linux" to be just "Linux": all distros being the same. You know, like Windows with its seven different versions of Vista (with their range of hardware requirements) from a single supplier. To be fair, the author does try to sound like she's not blaming GNU/Linux:

"Perhaps it's not the fault of Linux as much as it is the pervasiveness of Windows. Microsoft, after all, works with various hardware manufacturers to ensure their hardware works with Windows. And you still have driver problems crop up on occasion so what else can you expect with Linux."

Hmm, Microsoft works with hardware manufacturers? These would be the same manufacturers who said they weren't ready, and yet Vista was still released. The Vista release was such a sore point for Microsoft that a month ago another blog on the same site claimed that Microsoft is asking manufacturers to start testing the next Windows release now. But what follows that bit had me in absolute fits of laughter:

The average consumer just wants to be able to pop a CD into his optical drive, wait 10-15mins and have a working operating system.

The average consumer wants what? And in how many minutes? Has this blogger ever tried to install Windows? Sorry, but this is just a ridiculous claim. Show me this average consumer who wants to install their OS. Show me any modern OS that installs in 15 minutes (the best I've achieved is 18, and I'll assume live CDs are not allowed here). Most of the average users I know would rather buy a new PC than upgrade Windows. No, users wanting to install an OS in 15 minutes is a pure straw-man argument.

Another old chestnut

So what's next? Ah, the old "why are there so many GNU/Linux distributions?" question is given a new coat of paint:

"What is it with the collective egos of Linux coders that if one distribution doesn't suit them that they have to go and make a new one"

Talk about missing the point. One reason there are "so many" distributions is because there can be, and the ones that keep going are the ones that people find useful. Not every GNU/Linux user likes the idea of installing a whole desktop system just to set up a firewall or a router, or to recycle some of that hardware that Windows won't even get out of bed for. Some want to play MP3s out of the box, some want a small install footprint, a faster boot time or a longer support lifespan. Windows comes in different sizes to fit Microsoft's sales message. GNU/Linux comes in different flavours to fit the end-users' needs, wants, desires and just for-the-heck-of-it sense of curiosity. The author does offer a solution to the "problem":

Instead of rallying behind a single distro and making it the OS to beat, Linux grokkers tweak and promote their own Linux 'flavours'.

So let's dip into some analogies. Why are there so many amateur sports clubs, spare-time inventors and (best of all) tech weblogs? Why don't all the sports enthusiasts just get behind one local club and make it the club to beat? And what is it with the collective egos of tech bloggers that if one poorly supported anti-GNU/Linux rant doesn't suit them, they have to go and make a new one?

Here's a wake up call to those who keep asking the "why so many distributions?" question: as long as GNU/Linux is available under the GPL, there will be those who will tinker, tweak and create for no reason other than to enhance their own experience and maybe help a few others along the way. We tweak because we're allowed to and because, sometimes, it's fun.

The final straw

Her next point is not the one I laughed at the most, but after the others it was the straw that broke the camel's back (and I say that as a part-time Perl monger). Apparently the real litmus test is the documentation. Windows is, quite cleverly, not mentioned here; instead the praises of Apple are sung loud and clear. GNU/Linux documentation is apparently too archaic. Ubuntu is given some credit for doing a "pretty decent job", but apparently that's not enough. And as for the rest... "Well?". Well what?

I've never purchased an Apple computer, so I can't tell you anything about the documentation. What I will say is that at the prices they charge for their machines, I'd expect it to be perfect. But what documentation are we talking about here? The GNU/Linux system is made up of thousands of applications, some of which are well documented and others not so well. So are we talking about GNU/Linux documentation here, or free software documentation? No, I'm not nit-picking, because there are thousands of excellent howtos out there on a whole range of subjects, and not all of them will have the word "Linux" attached, yet they all apply equally to applications that run on GNU/Linux. Okay, so maybe it could be better collated, but so could a lot of things, including my CD collection. Another thing to remember is that a whole secondary market of books has grown up to fill this very niche. Several of the publishers involved sponsor articles in this magazine, and you'll find reviews of many tomes here as well. Yes, they cost money, but I'll wager it's less than the cost of that Mac that came with the shiny brochures.

The dummy race

Her closing argument is this:

In essence, until Linux becomes dummy-proof, it's not going to win over consumers. Make it easy, make it accessible - until Linux programmers get that, it's more likely that Apple will perhaps double its user base in the years to come at the expense of both Windows and Linux. It's not about the best OS winning - it's about the OS with the best user experience and Linux still isn't there yet.

What's not about the best OS winning? And winning what? The market? That's changing as we both write, and a few years from now it probably won't resemble the current one. When I read that statement, though, I was reminded of the quote by Rick Cook:

"Programming today is a race between software engineers striving to build bigger and better idiot-proof programs, and the Universe trying to produce bigger and better idiots. So far, the Universe is winning."

"Dummy-proof" is a moving target. It's one of those circle-of-life things (as Disney likes to say). If you make something simple to use, there will soon come along another set of users too lazy, stubborn, or apathetic (but rarely too "stupid") to learn how to use it properly. The answer is not (always) to make it easier to use, by which most people seem to mean "hide half the functions", but to make learning it more interesting. The quality of a user experience should not be judged by the cuteness of the help avatar or the number of steps in a wizard (or even by calling it a wizard). It should also be about how much it enhances your life and work, widens your perspective and awakens the child-like hunger to learn in you. It should make you want to show off what you can do to your friends. At least, that's how I see it, though I'm not sure any software, free or proprietary, has achieved that yet. Still, it's a good target.


As I said, there has been a long line of posts like this, so why did this one get my goat? I think it was because it addresses software users as consumers. You see, for me, free software (including GNU/Linux) reverses the trend that separates software supplier and consumer. I am one of those who baulks when train drivers address passengers as "customers". I dislike it when TV viewers or radio listeners are treated like cash machines. Making someone a consumer reduces their contribution to fiscal terms, and people are more than that. Free software makes users participants again; they are not mere consumers. Sure, they may start out that way, especially if they migrate from Windows, but in my experience they usually learn pretty fast that this game has different rules. Rules which let them play instead of just spectate. So you can't really say that GNU/Linux is failing because it's not addressing the needs of consumers, because largely it's not trying to. There is no free software marketing team, no advertising budget, no "mission statement". Free software and GNU/Linux are simply there. Pick them up, use them if you want. If you don't, well, that's fine; it's your freedom. But as I said before, the game has changed, so comparing apples and air is a bit of a waste really.

from fsm