30 April 2012

TV is DEAD!

TV IS DEAD, MUAHHAHAAH! But seriously, it is! TV is as dead as it can be, nothing is as dead as TV. I was watching HBO yesterday and realized that I am wasting my life looking at this idiot box. Conventional TV is a one-way communication system, and although there are phones and tweets and websites that can augment the experience by reflecting feedback, feedback doesn't fit with the inner workings of a TV channel. Sending feedback to CNN is like prayer (a.k.a. GOD feedback); my "NEWS FAIL LOL" comment somehow doesn't translate into affecting broadcast scheduling (CNN has a plan (CNN has an agenda))! But since the Internet is king media, TV broadcasting tries to suck up to it, and that is why channels now have crowd-sourced news commentary programs, like Aljazeera's "The Stream", where bloggers and vloggers battle to the death to win incredible prizes.

TV is old media, YouTube is new media! User generated content can surpass dogmatic broadcasting with no effort, simply by volume; our network is flooded with easily accessible content that is more relevant to our interests, by idiots for idiots! We have on-demand free infotainment that can be consumed every which way but loose. And TV is following along, I have apps on my tablet that allow me to watch CNN, Aljazeera and RT while at work, school, the toilet, the subway or the park! Yet I rarely do, what I watch is YouTube videos, granted that both Aljazeera and RT have YouTube channels (I am subscribed to both). But I don't want news on YouTube, I want crowd commentary, way over the top misguided uninformed personal subjective biased opinion. But if I want useless noise, why not watch FOX NEWS? Because Fox News doesn't cover the topics I care about, they don't have the type of shows I like, and nobody does. Sometimes the Discovery Science channel shows something interesting, but it is rare that they show something I like, plus even the things I like don't always fit in my schedule. Everything needs to be "on demand", and it is more and more as we bridge the gap between TV and the Internet. TV is expensive, the Internet is cheap. I have a YouTube channel, this blog and I'm making a Google Sites website (which is awful), all free, plus I earn revenue from it (I don't actually, but you could). Granted, you will need to invest money if you want a more professional feel. But a low entry cost into content creation and distribution makes it possible for experiments like mine to fill a niche. Big networks don't go for a specific viewer, they are trying to hit a mythological personality stereotype that fits with all of humanity, one that neither nature nor THE GODS would ever let exist.

There are two shows I want to plug. The first one is an interest of mine since I'm a big Linux Fan Boy: it is Jupiter Broadcasting's Linux Action Show, a weekly live show that is also accessible on demand. This is a show tailor made for ME! The second one is also weekly, live and on demand, The Atheist Experience TV Show!

So OK, we have two working examples of net shows. So now the big companies want to invent the Internet TV, a hardware contraption that merges the TV and the PC into one. Of course we are talking about the Apple TV and the Google TV. But I have had an Internet enabled TV for as long as I have had a computer; I remember using an S-Video cable to connect my 21-inch retro tube sarcophagus style TV to the TV-out port on my GeForce FX graphics card. I have had YouTube and video on demand on my TV since the day YouTube was created. So yeah, they are really innovating (BS)! Two things you should remember when the big companies say they invented the Internet enabled TV: http://www.mythtv.org/ and http://xbmc.org/ ! So yeah, I'm returning the decoders to the cable company and getting my content over the web! And yes, Netflix and Hulu aren't available where I live.

27 April 2012

Mobile Blog Test

OK, so here is an attempt to blog on the move! I don't know if kids these days will grant me the mobile claim since the post is written on a tablet device, and there is a big difference between tablets and smartphones. I'm using a 10 inch Samsung Galaxy Tab and trying to use the Blogger Android app. It works, but it is missing some features. The most important one is the spell checker; my spelling capacity is pathetic at the best of times, but typos are a concern for everybody when using on-screen keyboards. I suppose that both are a matter of practice, and that with time, as I write more posts on the move, both my spelling and touch typing skills will improve.

Me Blogging in the bathroom!!!
How Mobile is that, lol!!!
A nice feature is the direct adding of pictures from the SD card or the camera, but you can't position them, so I don't know where in the post they will end up. Alternatively you can mobile blog using the web browser. Now, on this device the web browser experience is clunky; the interface of the Blogger compose page is not touch friendly. The images added in the app are just thrown in after the text, so there is definitely a need to use a browser to align them with the paragraphs they are relevant to, but this is impossible with the touch interface since drag and drop really isn't as good on a touch device as people would think. And it's not only clunky with touch, it is clunky with the mouse too; with touch it just doesn't work. Another problem is that if you switch between the app and the web methods it duplicates your pictures, and this just isn't good. I'm warning you, be very careful not to jobble (what? it's a word.) your nicely typed and composed blog posts by making the error of opening them in the app, there will be unforeseen consequences.

The mobile blogging experience with the Blogger app is lacking, but it is easier to use than the Blogger website in a touch controlled browser. You would expect that a Java application would be easier to use than a JavaScript web applet, and it is easier to make it feature-full; somehow Blogger or Google failed to care about that. If you want to mobile-blog, use the Blogger app, but go to a computer for the final edit. At this point of the blog I'm touch typing like a boss, using the on-screen keyboard like a PlayStation controller (and I am a master at that controller, it's like an art form for me), but this technology is more suited to 140-character social posts, not paragraphs of blog.

Now I am back at a computer! We rarely take the time to appreciate what is truly good technology; a nice comfortable QWERTY keyboard always takes the cake. Maybe it's just something that my generation likes, and kids growing up and experiencing "new" technology will some day rant against the high productivity conventions of our time.

26 April 2012

VIVA LA Debian

GNU/Linux Distribution Timeline 12.02
http://futurist.se/gldt/
The most important distro in the Linux ecosystem is indisputably Debian ( and I don't care what the FSF say )! There is a balancing act going on in the Linux community, Fedora on the modern edge against Debian on the stable edge. Both trying to set a trend in computing, both underappreciated by the community. And I'm not trying to say that Linux is Fedora and Debian with whatever comes between them as fine gradations from one philosophy to the other; there is no line between them with other distros at fixed increments denoting a shift in the concept. And when I say Fedora is modern, I'm not forgetting the hardcore, more customizable, suicidal distros that "real geeks" with free time use, it's just for comparison purposes. Linux is more like a half-mesh network, where every node shares ideas with the others. And I say half-mesh since there is inheritance of technology among the nodes, with the core ( the Linux kernel ) interconnecting the major branches, but the community ( the people ) interconnecting the smallest branches on the other end by following technology trends.

Let's get to the point. GNU/Linux is a huge sea of code generated by a community of developers. In order to get this code to the users, developers package it in a distribution. Debian is a big, stable Linux distribution ( not strictly "Linux" according to them ). The Debian project accomplishes the bulk of the work behind a usable stable operating system, it has everything needed to use your computer. In order to bring the variations needed to fill every niche of computing, other projects take Debian and refine it. You can use Debian for everything, granted you are up to the task of configuring the system. But it would be inefficient to do that at every install, for every computer, every time; it is easier to just install an OS preconfigured especially for the task. There are two names I recognize when looking through the gldt Debian branch, Knoppix and Ubuntu. Knoppix is something you should look at, Ubuntu you already know, vastly different but both Debian! Then of course you want to make sure that the more specialized needs of users are also catered for: Ubuntu comes in two versions, one for the desktop and one for the server. Then there are distributions based on Ubuntu in the same way that Ubuntu is based on Debian, a very popular one being linuxMint. A strange linuxMint version is LMDE (Linux Mint Debian Edition (the distro I am currently on! (it's good))), but all of linuxMint is Debian based, and everything Ubuntu based is Debian based. PeppermintOS is as much Debian as 64studio and XBMCbuntu, they are all Debian!

Some Ubuntu fan boys are crying right now ( and yes, Ubuntu is so big that it has its own fan boys )! What you need to realize is that if Debian dies, the workload for the Ubuntu people will jump so much that they will have to turn to another Linux distro to take code from ( no offence ). So if you like Ubuntu or linuxMint or any other Debian based distro, click here!

23 April 2012

Stress Depression Frustration

If you are happy, don't waste your time with this post, it's going to be just a pissed-off rant against the industry. Now, I love technology! So imagine how thrilled I was when my colleague brought me a monster of a server to play around with yesterday. We have a client that needs big strong servers for development purposes, so we ordered one. It is a Supermicro dual-CPU system with two Xeons, each with 4 cores and 8 threads ( so 16 threads total ), 24 GB of RAM, no idea how much storage, and we decided to put in two Ati/Sapphire HD 6970 graphics cards for math purposes. And then it turns out he wants Windows Server 2008 ( just fail ). Oh my god, the god of fail has cursed this land. So the system came to my house somewhere around 16:00 (4pm) on Sunday. It is 18:00 (6pm) Monday as I write this, and I just got out of bed.

Issue one. The server doesn't have an optical disc drive, it doesn't need one. I thought "no problem", I haven't used a CD or DVD in a while; I usually install some sort of Linux from USB and it's as easy as `dd if=image.iso of=/dev/sdb`. But that trick doesn't work with Windows, so I thought, yeah right, Windows 7 on a flash drive was just: format to FAT32, mark the partition active and bootable, and copy the contents (mother of fail). It takes about 30 min to dd and 30 min more for the copy, so I wasted 1 hour and ended up with nothing. I tried some more stuff, wasting more time. Finally I figured out that you need a bootable NTFS partition with the files on it in order for it to work. All in all, 4 hours wasted.
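For what it's worth, here is a rough sketch of the bootable-NTFS-stick approach from a Linux box. This is reconstructed from memory, not a recipe: the device name and ISO name are placeholders, and the ms-sys flags should be double checked against `ms-sys -h` before you point any of this at a real disk.

    # /dev/sdb and win2008.iso are placeholders, triple check the device name
    parted --script /dev/sdb mklabel msdos mkpart primary ntfs 1MiB 100% set 1 boot on
    mkfs.ntfs -f -L WIN2008 /dev/sdb1
    # ms-sys (a separate package) writes the Windows 7 style MBR and the NTFS
    # partition boot record; flags quoted from memory, verify with ms-sys -h
    ms-sys -7 /dev/sdb
    ms-sys -n /dev/sdb1
    # copy the installer files from the ISO onto the stick
    mkdir -p /mnt/iso /mnt/usb
    mount -o loop win2008.iso /mnt/iso
    mount /dev/sdb1 /mnt/usb
    cp -r /mnt/iso/* /mnt/usb/
    umount /mnt/usb /mnt/iso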

Issue two. When I said that the system had two graphics cards, I meant to say the system had one graphics card in it, and I had one fresh from the store in a box. If you have ever had the opportunity to assemble a SLI or CrossFire system then you know how important a BIG case is. Well, this one was big, but it was a rack mountable 4U case, nice and long but of standard height/width (depending on how you look at it), so there was barely enough room. On top of that the case and the power supply are sort of integrated and you don't have a lot of free power connectors; the connectors I did manage to steal from the HDD section almost didn't reach the cards. And the PCI Express RAID controller's cables were getting in the way of the CrossFire bridge between the cards. I have no idea how much time I wasted on that.

Issue three. The install process, not exactly an issue. Everything was smooth and worked nice and fast, until we came to the updates. Why is Windows updating so slow? With all the money Microsoft makes they should have the bandwidth and mirrors needed for the update process to be super fast. And maybe it is, maybe it's just that the files are bigger on Windows compared to Linux, which makes me think that Linux distros generally have better infrastructure than Microsoft. Time wasted: an hour and a half, or so.

Issue four. I love AMD, there is something about the underdog that society gravitates to. It's like Intel is so big and powerful and only AMD can save us (Intel is freaking awesome (just to be clear)). Since AMD bought Ati there has been a great effort to improve the proprietary driver for Linux, and this is awesome. But why don't they have a working driver for Windows Server 2008? The cards are identified and the drivers do work, but the Catalyst Control Center doesn't. I tried 3 different versions of the Windows 7 driver from the Ati website, then I tried the Windows 8 driver, then I tried a driver from the Sapphire website. No happy, no joy. I just went to bed at about 04:00 ( 4am ). My colleague came to take the system somewhere around 9:00; it is going in the datacenter and we need to resolve the issues remotely. Remote driver maintenance on a Windows system, we have so much fail coming up!

Now, due to a general industry failure I have wasted my time. I missed work on Monday, so I lost money, and it was for nothing, since the client might not be satisfied by the performance of the graphics cards. And it doesn't end here, we have to solve this and find a way to check what mode these two graphics cards are working in, and a way to configure them. Microsoft needs to make their installation media usable; we are post optical, everything comes over the wire or on a flash device. Plus, their update concept is flawed, they need to fix it. Case and motherboard design needs to catch up with the trend of multiple huge long graphics cards. AMD/Ati need to test their desktop products' drivers on all possible platforms. Not only am I not going to buy a workstation product from them just so that I can use it on Windows Server, I'm not going to buy anything from them until they fix this.

Blender, Absolute Triumph!

There are good cases to be made for Gimp and Inkscape as a fully fledged graphics design suite. The only thing that I missed when I crossed over to Linux was the canvas rotate function in Photoshop, and Wacom support for the platform was nothing to write home about. In all honesty I found the Linux graphics design experience lacking; one year later, however, I can't imagine doing that type of work on a Windows system. Back in my Windows days I also liked to play around with 3D, and I recently got back to that hobby of mine. And the Linux experience of 3D design, modeling, animation and rendering is in no way lacking. The first piece of evidence I would like to present is:
© copyright Blender Foundation | www.sintel.org
This short movie is in part a product of the open development model applied to art; it shows what the capabilities of Blender were at the time it was produced. It was made in 2010 and is my favorite of their open movie projects. If you go to the Blender website you can find more short movies, I suggest you watch Big Buck Bunny in case you are sad after watching Sintel and you need a laugh. The Blender project is probably the most shining example of FOSS success, the software is very modern and powerful and the development process seems to run very smoothly. Recently they added motion/camera tracking, which makes Blender an ideal choice for video special effects. It also has soft body simulation and cloth simulation, and it actually has a built-in game engine that can be used for simulations and interactive 3D applications. Here is a little experiment I made with cloth and a monkey:
 
And a face I made:
By no means am I a professional, but both these things took me no time to make, plus my mum says they are awesome. So there!

These are all features you would expect to have in this type of software. A feature that I don't remember using in other 3D graphics suites on Windows is the UI customization ability. The user interface of Blender is like a mesh object, you can change and modify it as much as you want; you can make it look like the UI of any other piece of software that you are familiar with. Although eventually you just strip everything down to the absolute minimum and use the keyboard for everything. Blender is very pro oriented in that regard; once you get used to it ( and it doesn't take long ) you can chuck the menus aside and just concentrate on a full screen view of your model. That philosophy of working allows you to be very productive. Blender uses the Python programming language for scripting and OpenGL for drawing, which allows the software to work on all platforms. Plus it has a very powerful API that allows you to change and extend the application; if you are a full-on geek you can write yourself custom tools or build a video game within Blender. If some proprietary piece of 3D software is keeping you from crossing over to Linux, try Blender on the platform you are currently using. It may not be an argument for Linux, but Blender isn't a strictly Linux thing; artists and graphics designers are using it and loving it on all platforms. Maybe I can't convince you to change your OS, but you should consider this 3D graphics suite.

19 April 2012

The samba tutorial. A learning experience.

DISCLAIMER
The setup proposed in this material should never be used on a production system. This is strictly for development purposes. Plus, I can't imagine a good reason to have samba installed on an actual web server.



I hope that my tutorials are as much of a learning experience for the people watching them as they are for the guy making them (me). Linux is a big universe that you may not know enough about, and maybe you can't know everything about it. There is the core of the OS, a.k.a. the kernel, called Linux, fundamental to the operation of the system yet hidden behind a silent veil of stability. A microcosm in itself whose machinations are reserved for the geekiest of geeks. Above it there is a herd of GNU userspace applications running POSIX (Portable Operating System Interface) compliantly across the endless fields of UNIX. Two parts merged in a multitude of distributions that are a testament to imagination, creations inspired by freedom and driven by the force of human endeavor. All in harmony, spreading across the physical and the logical layers of existence itself, connecting not the technological but the pure human network. And you can run samba on it.


Samba is a network service that allows a UNIX or UNIX-like system to masquerade (colloquially speaking (I think)) as a Windows system on the network, allowing the sharing of different resources. It is a service that wouldn't have been thought of in a perfect world, but is practically indispensable in ours. If you follow my blog you will know that I am new to Linux (and the UNIX world), and that my IT career is mainly in Windows administration (there is no such thing). So probably the first gap in Linux adoption for me personally was network interoperability. I am trying to help other people see the light, so I will share the required entry knowledge in an attempt to spare you the frustration of tackling this difficult subject on your own, and then just point you in the right direction so that you may continue your studies. The reason for crossing over was my interest in web development, so we take the point of view of a web developer as we look at samba. We do this on an openSUSE based virtual web/development server that is part of the B.L.H.T.

We start off with a simple install and a simple setup of samba through YaST. The installation is done in a very YaST way: we initially just install the YaST module that is used to configure samba, and not the samba server itself. Then we restart YaST and enter its samba configuration, and from then on the installation of the actual samba server is handled automagically. The samba-server configuration module is also used to set the server to start at boot, and it handles the configuration of the firewall. We delete the default shares (optional), and create a new share that gives write access to the Apache document directory.
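If you would rather skip the clicking, the command line equivalent looks roughly like this. This is only a sketch for openSUSE; the package names should be right, but the service commands depend on the release (older ones use chkconfig/rcsmb instead of systemctl), so treat them as assumptions:

    # install the samba server plus the YaST module used in the video
    zypper install samba yast2-samba-server
    # start the services and have them come up at boot (systemd based releases;
    # on older openSUSE use: chkconfig smb on && rcsmb start)
    systemctl enable smb nmb
    systemctl start smb nmb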

In the second video we take a brief look at the smb.conf file that was automatically generated by YaST. Then we cover some basic file permission issues that everybody who configures samba needs to know. Samba at this point is set to allow everybody (and his brother) to read and write to the shared resource. Fortunately for us it's not that easy to make a Linux box insecure, but we manage anyway. Then I just write a very long sentence, whose only purpose is to take up space so that the blog page looks more esthetically pleasing to me, and I don't just freak out because of my OCD nature (unfortunately that just wastes your time and annoys you)!
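To give you an idea of what ends up in /etc/samba/smb.conf, a share that gives write access to the Apache document directory looks roughly like this. The share name is my own example, and the wide open guest access is exactly the insecure state described above, so remember the disclaimer at the top:

    [www]
        comment = Apache document root, development only
        path = /srv/www
        read only = no
        ; guest access is what makes this setup wide open; once smbpasswd users
        ; exist you would replace it with something like: valid users = webdev
        guest ok = yes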

We end things off by adding a new user to samba by issuing the `smbpasswd -a` command and setting a password for remote access. The thing you really need to grasp is that file permissions do not care about your samba user or password, the files still have their owner and group. To finally, really have a write-enabled shared resource, we change the owning group of the /srv/www directory and give the new group write access. From then on, new files created inside will have their owners set correctly, so that the web developer can just mount the shared folder and work with the files as if they were local.
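Condensed into commands, that last step looks more or less like this. The user and group names are made up for the example, and the setgid bit at the end is an extra touch of mine, not something the videos require:

    # give the web developer a samba password (the system user must already exist)
    smbpasswd -a webdev
    # hand the apache document root to a group the developer belongs to
    groupadd www-devs
    usermod -a -G www-devs webdev
    chgrp -R www-devs /srv/www
    chmod -R g+w /srv/www
    # optional: new files created inside inherit the www-devs group
    chmod g+s /srv/www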

For more information about openSUSE and samba, click on them!

16 April 2012

OMG GUI Kerfuffle!

The work space.
I use Gnome, I like Gnome! But Gnome isn't what it was. At home I have two systems, both Gnome 2: one is Mint 11 and one is Ubuntu 10.04 LTS, and I use Mint 12 on my workstation at my job. The Mint 12 system is a mash-up of Gnome 2 and Gnome 3, and it's awful. It's like the menu land of redundancy, it looks awful. You have the Gnome Shell UI elements plus a Gnome 2 taskbar that doesn't belong, or if you prefer Gnome 2, the Shell doesn't belong. But I am happy, because it helped me realize that given both options I use the Shell elements more. And now my two home systems are old and it's time to re-install. I was considering LMDE ( Linux Mint Debian Edition ) for the Mint 11 box because the screenshot on the Mint website is a classical Gnome 2 system, but when I tried it in a virtual machine it turned out it was so much of a rolling release distro that it had 1400 updates to install, and after rebooting it went Gnome 3. So there is no going back anymore! But if I am going to use Gnome Shell I prefer it in a green openSUSE themed way. Post re-install there will be an openSUSE system in my house no matter what, but openSUSE's KDE implementation is the most gorgeous looking thing ever. At this point I start thinking I need three computers on my desk. Or maybe I should go with something totally different. The idea is that the laptop stays Ubuntu, and with the new LTS it is going to be current again, but I want to live in Arch for some time, but I need my systems to make my YouTube videos. What if I go KDE... As for the box, LMDE is a solution, but then I have no openSUSE install. And then I'm back at square one.

Linux runs 3 times on one PC!
I use Gnome, I like Gnome! But I need to make the Gnome Shell transition. Unity is an option, Ubuntu is sort of an interest. Maybe I can go with the new Ubuntu at my job, but then we will have two Ubuntu systems running side by side. And I really liked the idea of having Gnome Shell and Unity on one desk at home in order to be able to compare them, and I don't want to re-install over and over again just to make a review of the new Unity ( it's not new ). Plus LMDE seems more and more like a nice Gnome Shell based thing. While wondering, I went to the openSUSE website and started looking around. Have you ever heard of Petite Linux? It is a nice experiment using the Enlightenment desktop, based on openSUSE 11.4. Some people might think it's not usable, but it's not entirely not usable. My problem with it isn't that I don't like it, I just don't have the patience to experiment, and if I did I'd want to play around with Arch. If I go KDE I want to see the Fedora implementation, so I need another computer, but I can't work on three computers at the same time just to scratch an itch. And I have been distro hopping ( Fedora, Debian, Mint, PinguyOS, Ubuntu, openSUSE, Mint again ), it's time to settle down. Maybe I should make my own, the hard way using the Linux From Scratch books or the easy way through SUSE Studio and the openSUSE Build Service.

It actually doesn't matter what I use since they are all nice and they all work. If you choose to use Linux then you probably like to have 'choice', but for a geek Linux is what a chocolate factory is to a fat kid. Too many versions with too many options. I spend about an hour a day wondering what to use, what to install, what to test and what to promote. And after that one hour I'm somewhere between nauseated and suicidal! But then I think about the Windows and Apple users, they have no choice.

13 April 2012

Why yes, I am a fascist!

I really beat my chest in public professing how anti-fascist I am! And I just recently found out that I am a fascist myself! Let me explain. There is this watchdog group called the FSF ( fascist software foundation ( free software foundation ) ) that decides who to kill during the imminent uprising.

An awesome additional view on the subject is the Linux Action Show interview with Richard Stallman, and their negative-in-the-freedom-dimension follow-up! ( actually don't waste your time reading this, just go watch the two episodes ( come back if you want to hate on the FSF some more afterwards ) )

The problem isn't really a problem, it's just weird. The free software foundation doesn't endorse the free open source projects that have brought awesome to free software. This is a link to the website of the organisation that explains why, and you can keep reading to see why I disagree. The two main issues against the popular distributions are:
        - no policy, or no clear or strict enough policy, against non-free software
        - "blobs" in the kernel
On the basis of these two rules the FSF excludes Debian, Fedora and other distros. Fedora is excluded based on nonfree firmware, which is doublespeak for hardware support. So in that case the free software foundation discriminates against hardware, dictating what hardware you can use; I thought only Apple did that! Whenever thinking about FOSS I always tend to think in the context of new users, so let's pull a new user out of our hat and see what happens to him. Our user has a laptop, he has had it for several years. It is old, but it works. His problem is that it's too old for Windows 7, but Windows XP has too many virus issues.

http://trisquel.info/
So he hears about this thing GNU/Linux from a hippie friend of his. He googles it and ends up on the FSF website. From there he finds a download link to one of the truly free distros, and goes for Trisquel because it has the nicest icon. Now this is a guy who doesn't know much about computers, but he manages to get through the installation process. Post install his wifi may or may not work, he has a Bison webcam that will never work ( but don't tell him, he is going to find this out after installing Skype ), and his screen blinks black from time to time. He google searches again and finds out he needs the proprietary driver from nvidia. Now, to add insult to injury, the FSF has decided that if it is easy to acquire a nonfree piece of firmware, then you don't comply; thus Debian ( which is otherwise blob clear ) isn't on the list, since they have an easy ( the FSF is trying to make easy sound like child molester ) way of making the system usable. But that means that if Trisquel is on the list, then they got on the list by making it hard for our fictional user. What do I mean by hard? Here is what nvidia told him.
nvidia's Linux driver install instructions
What they didn't tell him is that he needs to be in runlevel 3, that nouveau needs to be blocked by an argument in the boot loader, and that he needs dkms, gcc, the kernel-dev headers and stuff.
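Spelled out, the missing instructions look roughly like this. It is only a sketch: the package names are the Ubuntu style ones (whether Trisquel's repos actually carry them is a separate question, see below), the installer file name is whatever nvidia's download is called, and some distros want the nouveau blacklist in /etc/modprobe.d/ rather than on the kernel command line:

    # build tools, dkms and the headers for the running kernel
    sudo apt-get install build-essential dkms linux-headers-$(uname -r)
    # keep nouveau away from the card: add nouveau.modeset=0 to the kernel line
    # in the boot loader, or blacklist it in /etc/modprobe.d/blacklist-nouveau.conf
    # then drop out of X (the "runlevel 3" part) before running the installer
    sudo service gdm stop     # or whichever display manager is running
    sudo sh ./NVIDIA-Linux-x86-*.run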



At this point he thinks "To hell with Google!" and picks up the phone; he calls his hippie friend ( I hate hippies ) and gets the response "Don't worry Windows DUDE, just use the repos man! Repos are sharing and love man!!!". And he does. I like Trisquel, but they are a bit... here is a comparison between my linuxMint experience and my Trisquel experience. Look at the screenshots, LOOK! I have used many distros, but I have never had that problem before; somebody contact me and tell me what is going on. I also searched for kernel and gcc, nothing! If you click on the Philosophy link on the gnu.org website you get the "four essential freedoms", and the second one is, quote, "to study and change the program in source code form", and that is good, because without the compiler that is all you are going to get, source code form.

Let's postulate that the hero of this story manages to get through these issues and the system is usable. He decides to learn a bit more about this thing called GNU/Linux, so of course he goes to sleepychildkungfu's YouTube Channel ( I have literally no shame ), and then Gnash happens. It has sound and no video, but fortunately it crashes before he can notice. Gnash actually works nicely with YouTube, but it wasn't always like that. Flash games are a bit choppy though, and there isn't much on the web about Gnash and that topic. Will a Windows user realize that he isn't using Flash, and can Flash even work in Midori? How much googling does it take before he comes to the conclusion that he needs to install Chrome? And when he does get it, not everything he needs is in the repos; everywhere you go they ask you if you want a deb or an rpm, whether you are on a Debian based distro or a Fedora fork. New users may not know what that means, but it's a safe bet that there is going to be some sort of "Ubuntu users click here" option. But he isn't an Ubuntu user, the FSF told him to use Trisquel! You can't promote unknown distros to new users, because there is less general support on the web and from software developers. Our hero has been in a boxing match with his system for 12 hours now. He just wants to put some movie on and watch till he falls asleep. Fortunately at least that works out of the box. Maybe tomorrow he will forget to write GNU in front of Linux in the search field, and discover something awesome that works for him.

My personal view on the subject is that the FSF is doing more harm than good by distancing themselves from good distros. Even if we let some blobs and non-free packages in, it will only help with the promotion of FOSS. Our numbers are too small to demand that hardware and software companies support free, open sourced firmware, drivers and applications. Make a good operating system, gain popularity and influence, and at one point the industry will start respecting your terms of conduct. Till then just worry about the new users and their experience. And make an effort not to dictate how a computer must be used, because you sound like a bunch of fascists!

12 April 2012

Gaming Development Philosophy

At first glance playing and gaming are two words that we use to convey the same concept. They are colloquially interchangeable in some contexts but not in others. You can say that you play a game, you can play the game of hopscotch or football. Or you can play BATTLEFIELD ( and you should! )!


But when you play BATTLEFIELD you can say that you are gaming, on a gaming platform, and you are a gamer. Are football players gamers, are little hopscotch girls gamers? Maybe they are or maybe they are not, but I have never heard somebody refer to them that way! At this point we draw a distinction based on the device used. So is playing Angry Birds gaming? No, not if you throw a bird and then you check your Facebook, and then throw another bird. No. Try that with Battlefield 3, you will get your ass handed to you in a paper bag and be told to go home. Battlefield is a game that requires a commitment to learn the gear and the environment. You need to understand how the team dynamic operates on a game by game basis, and function as part of the whole under changing circumstances. You need to isolate yourself from the physical world around you and clear your mind of the mundane. There is only you and the game, in a pseudo reality where the consequences for failure are as real as death itself! There is no such thing as casual gaming, casual gaming is just playing! You play a game, and if the level of engagement with it is sufficiently high as to detach you from the world, you are gaming! An example is my experience with Gran Turismo 2, which I played nonstop for 72 hours ( minus potty and eat time )! That is just my personal, subjective way of viewing gaming. I'm probably just trying to define myself as separate from the people who tap on a mobile device and call themselves gamers, because that title means something to me, yet there are obvious distinctions between Angry Birds and what I call real games. I tried to engage with some of the mobile titles out there; the best one is the mobile edition of Dead Space, but the controls were appalling! There really is no room for the usual "anti popular stuff" rant in this post, so back to the subject of gaming development.

The key to an engaging gaming experience for me is the art of problem creation and motivation. As a developer you need to con the gamer into caring about what happens next, and isolate him from 'next' with an obstacle. If somebody has bought your game he/she will play through it ( or at least most of it ). I bought "Alone In The Dark: The New Nightmare", haven't finished it, never buying another Infogrames title ever again ( ever )! But at that point the idiots in marketing have done a good job, not you ( the developer )! You have done a good job when somebody sits down to play your game and stops playing when the alarm clock goes off to inform him it's time to get up ( as in it's morning ( as in he played all night )). When talking about motivation there are several separate components, but the most important one is the obstacles, a.k.a. gameplay. Games are fun because they present a problem solving challenge, and good games are highly stimulating to the brain. Good games shouldn't be easy, but they shouldn't be so hard that they stress the gamer. I prefer a serious challenge, but it isn't a must for a game to grab your attention; even easy tasks can keep the gamer engaged and entertained! Most games alternate between moments of pressure and moments of calm. Reaching a perfect challenge level is impossible since it is subjective, and that is why games ask for the gamer's preferred difficulty level. Control scheme and mechanics shouldn't contribute to difficulty; clunky controls don't raise difficulty, they are just frustrating. Exactly why is Mario always running to the right, to save the princess or to see the end of the level? Mario is actually running right to see what is to the right of the screen! In a sense Mario is an obstacle junkie who passes obstacles to reach more obstacles to pass. In games it really is the thrill of the chase and not the reward, since there isn't any reward, except the false feeling of accomplishment at the end!

Story telling.
In a game like Super Mario there isn't really a need to put in too much story, since that isn't the game's selling point. But the "saving the princess" concept is the same as the "find Sephiroth to save the planet" concept, and titles like Final Fantasy that are story heavy make an effort to give more than gameplay. Games can have intricate story lines that can glue the player to a device, just like a movie can glue you to the screen or a book can glue you to your ass. People have always liked a good story filled with characters that they can relate to; that is why we have books, movies, TV series, pub songs, legends.

Exploration.
The story also needs to be relevant to the progress of the gameplay by providing information on the tasks ahead, and it must fit the virtual world of the game. Most people like to travel to new places and explore; if you live in New York you go to Paris for your vacation. My travel agency doesn't offer any sunny post nuclear apocalypse destinations, unfortunately. But Fallout does, it offers an interesting environment to explore, filled with people to talk to or trade with or just plain old murder. When creating a game you can and should let your imagination run amok and really go all out for creativity. Games are an art form, so there isn't a formula for making good games. On top of that, as with other art, the measurement is strictly subjective!
rats

When thinking about what a good game is, I think of the games that have made me stay up all night. Then I try to dissect them into separate components and find which of them kept me playing, but it is never a single thing. A good game maybe doesn't have the best story or gameplay or technology. But it does have a nice fit between its parts, and it has that slightly original and slightly familiar appeal that keeps me engaged for hours.