the insignificant observer

My serious posts

Is Education 1.0 ready for Web 2.0 students?

Just for the sake of those who have not read it, I am copying the posts from the Litespeed forum here. The topic, entitled “Is Education 1.0 ready for Web 2.0 students?”, was started by Mr Wan (HOD of IT) in NYJC. I thought it was rather intriguing so I replied. This explains why I was called out after lecture on Friday.

Note: Please forgive the elementary errors in my replies as I typed them out quite fast.

{ 20 July 2007
Hi all,

Nothing to do with Chemistry this time. I just read this article (URL of article changed as linked file was stored locally on Litespeed) and found it interesting. You don’t have to read all of it; you can just scan through.

I am just gathering your views as to how technology can be used to engage students of your generation for teaching and learning. The article spoke of using iPods and podcasting, for example, as possible tools. Other articles I have read talked about gaming etc.

Share your views here pls, both For and Against. I would like to hear them. And even if I can’t implement them within these 2 yrs, your views and suggestions will benefit future generations of NY students.

Just me,

Mr Wan (HOD IT)
}

{7 January 2008 20:16

I dunno if it is too late for this reply considering the date of your post, but I will still post here.

In my opinion, given the nature of a JC, I feel most of us are not ready to embrace Web 2.0 or Ajax technologies. Since Web 2.0 relies in most part on two-way communication to realise its full potential, we have already failed miserably. An example would be the forums we have: although there are some faithful posters, their numbers are small compared to the number of pageviews the forums gather. This implies that (most) students still have the mindset of being information recipients as opposed to information givers.

Another issue that prevents this from taking off in a JC is the fact that most of us do not have laptops or devices that allow us to access low-cost Internet on the go. Unlike the polys or universities (or colleges in other countries), where owning one is almost a necessity, this is not the situation in a JC. Without such devices, most of us tend to use the net at home or sometimes in school (mostly for printing lecture notes). I may be wrong in my observations, but that is what I have seen in the general student population.

We could however conduct an experiment. Have an NY blog (if there is one already, I have not heard of its existence). Let anybody publish their thoughts on school life or anything for that matter; allow them to be anonymous if need be, but verify that they are NY students first. All posts are to be submitted for verification before posting to prevent abuse. Then analyse the response. To ensure maturity and to kickstart the project, we can let the JC2s post first. To ensure sufficient content, four classes per week each have to write a short “article” on their thoughts or any random subject. In this way, with 37 classes and about ten weeks in a term, every class will have posted at least once each term.

To ensure readership, we can teach students how to use RSS feeds and readers so they do not have to keep checking back for updates. A free option would be SharpReader, the RSS reader I use. A real-life example of RSS already being implemented is the NYConnexions website. If it’s possible, we can also expand this feature to cover school events or the release of new lecture notes (Liddat dun have to keep checking back, especially during the holidays).

We could also revamp the current forum system to make it more attractive. One example is to give students the freedom to set their own topics. Currently, we can mostly comment only on academic topics as dictated by the school (or you :) ). The exception is the suggestion box but, as you have seen, not many people use it. Also, create an automatic email notification system so posters know when someone replies, instead of checking back regularly only to find no replies.

Ok, another reason why I doubt the entire system is the fact that the JC system is very demanding in itself, both academically and in CCAs, leaving us with little time for “other” activities. Unlike the poly or uni students who tend to have more time on their hands to contribute, some (or most) of us just can’t be bothered.

Finally, what are the statistical results for the recent surveys posted on the school’s homepage? The latest being the issue about photocopying machines, I think. How many students have actually attempted the survey? How many have actually completed everything? How many gave some thought to the questions, especially the open-ended ones? I think that can be a good indicator in itself.

P/s, I tried doing the survey but I dunno why it refused to accept my answers. It kept saying that I had selected both genders when obviously I would not pick both!

My reply
}

{7 January 2008 23:13

Hi Kheng Meng,

First of all, thanks for your contribution. I think you are quite right in many of the points you made. It reflects maturity on your part. I don’t know how we can solve this prob though. The solution is really to start somewhere. But in fast-paced Singapore, this becomes a low priority…:-(

Maybe we can explore the NYConnexions idea further. See me after lecture or drop me an sms.

As for the forum page, I think many topics have been started by our fellow students. I have seen some. I would say it is a good start/good effort considering the time factor and all that.

7 January 2008 23:20

By the way folks, Mrs Chong and I are exploring the idea of communicating with overseas schools… via blogs, video conferencing, podcasts etc. That might be something most of us might be interested in. We have an external vendor who will help us do the link-up.

For starters, we are thinking of sharing OCIP ideas with people from other countries (e.g. UK, Australia, HK) who have had similar experiences. Other ideas are issues on Global Warming, the coming Olympics in Beijing or anything you can think of.

Give me an indication or drop me a private message if you are interested then we can move on from there.

}

So that’s about it. He did ask me some questions about how RSS worked and such, so I referred him to this post. Check it out if you have not.
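For the curious, here is a minimal sketch in Python of what an RSS reader like SharpReader does under the hood. It uses the third-party feedparser library, and the feed URL is hypothetical; substitute a real one.

    # Minimal sketch of what an RSS reader does: fetch the feed, parse it,
    # and list the entries. Uses the third-party "feedparser" library.
    import feedparser

    FEED_URL = "http://example.com/nyconnexions/rss.xml"  # hypothetical URL

    feed = feedparser.parse(FEED_URL)
    print(feed.feed.get("title", "(no title)"))

    # A real reader polls the feed periodically and shows only unseen entries.
    for entry in feed.entries[:5]:
        print("-", entry.title, "->", entry.link)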

January 13, 2008 | computers, education

Computer Security Part 1 : History

The parts of this article will be published on NYX in their respective sections in due course. For convenience, I have divided the entire article into 3 separate posts. As usual, I would advise that you read extremely long articles such as this one off its custom page at NYX. The colour scheme there is more suitable for it.

The word count for this article is at the end of the last section. I am afraid people will be scared off by it. If you want to learn something practical, just read the second section. Read the first and the last sections only if you are more adventurous. Besides, the first section is nothing special as I practically ripped it off entirely from Wikipedia.

Article Start:

When we think of computer security, the words “viruses” and “spyware” immediately come to mind. In most cases, these two are the most common means of breaching corporate networks, causing inconvenience and intruding into personal privacy. In this article, I shall focus more on viruses and worms since they usually cause the most damage today.

This article is divided into 3 parts:

1. History of viruses
2. Tips for end-users
3. Future roadmap of computer security

As such, let me dive into the history of the evolution of viruses and several key specimens that redefined the landscape of computer security.

According to Wikipedia, the first virus to appear was the Creeper worm in 1971. Its code was fairly rudimentary; all it did was move from computer to computer displaying the message “Catch me if you can”. No damage was done, and it died out soon after when another program, presumably written by the same programmer, removed all traces of it from infected computers.

This started the ball rolling. The next milestone was spreading a virus quickly while remaining undetected in the process. This milestone was reached in 1986 with the “Brain” virus. It copied itself to the boot sector of removable storage media such as floppy disks. In this way, it could spread very quickly, as floppy disks were the most common mode of data transfer in the 1980s before the advent of the Internet. Users only detected it when it began to slow down their floppy drives.

Up to this point, computer viruses were usually mild and largely confined to small local groups of computers. All this changed when use of the Internet became widespread. The Morris worm was one of the first to utilise the power of the Internet to propagate itself. Released in 1988, it was estimated to have infected about 10% of the computers then connected to the Internet. Infected computers slowed down dramatically as the worm ran multiple instances of itself within each system.

As you can see, the viruses of the 1980s era were focused mainly on attacking the operating system (OS) itself rather than using personal documents as a means of propagation. The “Melissa” virus of 1999 was the first macro virus to spread on a massive scale. It marked the shift away from attacking the OS directly towards infecting Word documents. Such documents were previously seen to be safe as they contained no executable code. Anti-virus companies were shocked by this new mode of transmission.

It turns out that the Melissa virus used the macro features of Microsoft Word to execute its code. The virus (a worm, strictly speaking) soon began to clog up email systems as it sent itself to multiple computers.

(A macro is a set of instructions embedded in a document. Its primary purpose is to automate certain processes during the use of the document. These instructions are usually run when the file is opened.)

Up to this point, however, all these viruses worked as individual stand-alones. The combined power of infected machines, harnessed through distributed computing to commit spamming attacks, had not yet been utilised. The Sobig worm, which first appeared in 2003, made use of this form of attack when it compromised hundreds of thousands of computers. The mass spamming from these computers crashed many corporate server systems worldwide.

The Blaster worm, dubbed the “lazy” worm by some, was created to exploit not a new, unknown vulnerability, but a known one. In fact, the loophole (in the RPC service) exploited by Blaster was already known to Microsoft, and a patch had been released one month before the worm’s appearance in August 2003. As a result of the Blaster worm, the tech community coined the term 30-day attack window and, subsequently, 0-day exploit. It signifies the vulnerability window between the discovery of a flaw and the point where all affected systems are finally patched.

Blaster also combined several key techniques, like buffer-overflow and denial-of-service (DoS) attacks. The DoS attacks were focused, relatively successfully, on the Windows Update site. Fortunately, the real site was at a different URL than the one targeted by the worm, and Microsoft escaped unscathed.

(A buffer overflow is a type of programming error that occurs when an input to a program is unexpectedly large, more than the program’s buffer can hold. This may cause the program to crash as it cannot handle the excess data. The most famous example of an overflow error occurred in 1996, when the European Space Agency’s Ariane 5 flight 501 rocket was destroyed shortly after takeoff; a 64-bit value had been squeezed into a 16-bit variable that could not hold it.

DoS attacks are usually focused on servers hosting websites or corporate VPN servers. They usually start with infecting a huge number of computers. Then, at a preset time, these computers all attempt to access a public server on the Internet with a specific request, such as loading a webpage. The server, unable to handle the heavy load, will crash, bringing down any other mission-critical tasks it may be required to do.)

Now to recent times: the Storm worm. It started spreading in January 2007 and continues to do so today, albeit with much less potency. It set the record for the most computers infected at any one time; experts put that number at between 1 and 50 million.

It combined many modern methods, like DoS attacks and distributed computing in the form of a botnet. The Storm worm uses peer-to-peer technology to communicate and divide tasks among infected computers. This technology is similar to file-sharing torrents, where no central server leads the swarm.

(Bots are individual workstations controlled to cooperate with multiple other bots to accomplish certain tasks, such as attacking a website, spamming, or seeking out other computers to infect. A botnet is essentially the collection of all the bots.)

The bots operate as individual entities. When they contact each other, certain computers (usually the faster ones) automatically assume leadership positions to dictate the tasks done by the slower computers. When one bot leaves the network, there is always another bot available to take over its job.

It is stunning to note that the virus code in all of the bots is generally identical, and this form of leadership behaviour can arise even when every machine is processing the same instructions.
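To see how identical code can still produce leaders, here is a toy sketch in Python. This is purely my own illustration of the idea, not Storm’s actual code: every node applies the same deterministic rule to the shared list of peers, so all of them agree on one leader without any central coordination.

    # Toy illustration (not Storm's real code): every node runs this same
    # rule over the same peer list, so all nodes agree on one leader.

    def elect_leader(peers):
        """peers: list of (speed_score, node_id) pairs known to this node."""
        # Highest speed wins; the node_id breaks ties deterministically.
        return max(peers)[1]

    peers = [(120, "node-a"), (310, "node-b"), (95, "node-c")]
    print(elect_leader(peers))  # node-b, on every machine

    # If the leader drops out, the same rule instantly yields a successor.
    survivors = [p for p in peers if p[1] != "node-b"]
    print(elect_leader(survivors))  # node-a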

The Storm botnet also employs certain defensive strategies to ensure its survival, one of which is the DoS attack. All bots are constantly on the lookout for any computer that attempts to detect and eliminate them. Once such a computer is found, the bot will immediately notify the leader it is associated with. The leader will analyse the severity of the threat and mobilise a calculated number of bots to attack it using the DoS technique, enlisting the help of other leaders if necessary.

Next, Part 2: Tips for End-users

November 18, 2007 | CCA, computers

Com Security Part 2: Tips for End-users

Back, Part 1: History of Computer Viruses

Given the nature of this worm and the number of computers affected, it is quite possible that the very computer you are using to read this is infected. So you may be wondering: what tools do we end users have to deal with such threats?

I will be answering this question in this section.

For end-users, just remember five things: Firewall, Antivirus, Constant Updates, Alternative Software and Common Sense.

Firewall

A firewall is software or hardware that stops unauthorised data from passing between two networks. It is usually considered the first line of defence for most computers (and networks) as it prevents infection in the first place. As prevention is always better than cure, I will elaborate in detail on its importance and usage; a small sketch of the idea follows.
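Here is a toy sketch in Python of the decision an inbound packet filter makes. Real firewalls hook into the network stack and track connection state; the rule set here is made up for illustration.

    # Toy sketch of a packet filter's core decision. Real firewalls work
    # inside the network stack; these rules are made up for illustration.

    ALLOWED_INBOUND_PORTS = {80, 443}  # e.g. permit web traffic only

    def allow_packet(direction, port, solicited):
        """solicited: True if the packet answers a request we made ourselves."""
        if direction == "out":
            return True   # this toy filter lets all outbound traffic through
        if solicited:
            return True   # replies to our own requests are fine
        return port in ALLOWED_INBOUND_PORTS

    print(allow_packet("in", 443, solicited=False))  # True: allowed service
    print(allow_packet("in", 135, solicited=False))  # False: blocks e.g. the
                                                     # RPC port Blaster used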

A firewall is a must-have for any computer system or network that is connected to the Internet today. Today’s viruses and worms do not even need user intervention to spread themselves. Take the Blaster worm I mentioned earlier: it can infect an unpatched, unprotected PC connected to the Internet without the user doing anything on it.

According to the SANS Institute, as reported by ZDNet News, it takes just 20 minutes for an unprotected computer to be compromised. And take note, that was in 2004; there is every reason to believe this time frame has grown shorter by now. Even for the most seasoned techies, a firewall is still a must.

Thankfully, every computer with Windows XP Service Pack 2 or later installed has an inbound firewall turned on by default. That gives users a basic level of security. To check if yours is turned on, see the instructions from Microsoft here.

It’s good to note that the built-in firewall in Windows XP does not protect against outbound transmissions. That is to say, the moment a virus manages to get into a system, there is nothing to stop it from sending information such as credit card numbers back to its creators. Microsoft claims that Windows Vista has the ability to control outbound data, but so far an independent tester has found it unreliable and unintuitive to use.

Therefore, the best solution is to get a third-party firewall for this task. Common ones include ZoneAlarm and Comodo; these two are free for home use.

The main feature that distinguishes most third-party firewalls from the Windows Firewall is the ability to manage outbound connections. The problem is, outbound filtering requires active user intervention to work effectively: the user has to individually approve outbound access for every program that wants it.

Very often, the user is shown just the process name without any additional information. A user fed up with the constant questioning may simply click Allow at every prompt or, worse still, turn off the firewall altogether. This is obviously not the correct approach. The firewall has no way of determining whether a program is legitimate, as there are countless programs in the world; it is up to the user to search for the name of the process online and hopefully make the correct decision.

It is not common knowledge that routers are actually a form of hardware firewall in themselves. Because of the way routers operate, they automatically filter out any unrequested incoming data. An added benefit is that the router’s programming resides in firmware on a chip, which makes it even less likely to be corrupted by a rogue program. DoS attacks are also blunted, as any attempt to bring down the system is focused on the external entity (the router) while the internal entities (the computers) remain secure.

The price we pay for increased security is reduced convenience.

Antivirus

As its name suggests, antivirus software searches for viruses on a computer and alerts the user if one is found. Take note: by this stage, the virus has already infected the system, so antivirus software is essentially a cure.

Antivirus software works like a security camera behind a locked door (the firewall). When an intruder enters the house, it attempts to identify the threat and eliminate it. Its function is fairly basic, so there is no need to explain much here.

The most common free antivirus software is AVG by Grisoft. There are paid options too, like Norton AntiVirus from Symantec and McAfee VirusScan. I will not join the debate over the quality of paid versus free versions as I have not used the paid ones. Your best bet is to read online reviews and ask around for advice.

When getting such software, always ensure that it offers on-access protection. This feature lets the software scan processes and files for rogue activity as they are being used. Some software, such as ClamWin antivirus, lacks this feature and requires the user to manually scan files for viruses. Either way, the core of the job is signature matching, sketched below.
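As an illustration, here is a minimal Python sketch of signature-based scanning. Real engines match byte patterns and apply heuristics rather than whole-file hashes; the one signature below is the hash of the harmless EICAR antivirus test file, and the file name is hypothetical.

    # Minimal sketch of signature-based scanning. Real engines match byte
    # patterns, not whole-file hashes; the single signature here is the MD5
    # of the harmless EICAR antivirus test file.
    import hashlib

    KNOWN_BAD_HASHES = {
        "44d88612fea8a8f36de82e1278abb02f",  # EICAR test file
    }

    def scan_file(path):
        """Return True if the file's hash matches a known signature."""
        with open(path, "rb") as f:
            digest = hashlib.md5(f.read()).hexdigest()
        return digest in KNOWN_BAD_HASHES

    print(scan_file("suspect_download.exe"))  # hypothetical file name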

There are prepackaged solutions that combine all the relevant software into a single package; they are usually cheaper than purchasing each program individually. Symantec, for example, has Norton Internet Security, which includes antivirus, firewall, antispyware and antispam components.

Constant Updates

This is not a mere third line of defence. It is a must-have in this age of rapidly evolving threats: your security software has to evolve in tandem to keep up.

First and foremost, ensure that the automatic update feature in Windows is turned on. If not, make sure you visit the Windows Update site regularly to install the latest patches. Do take a look at the software updates section of the site from time to time as well.

For firewalls, the need to update is less urgent, as firewall technology has stayed relatively constant over the years; new features usually appear only in newer versions of the software. But that does not mean you should neglect it entirely. Update when the software tells you to.

For antivirus software, no updates equals negligible defence. The software needs up-to-date signatures to detect the threats of today. Always ensure that its automatic update feature is enabled.

Alternative Software

This is another method of avoiding certain threats altogether. For this part, I will elaborate only on web browsers and operating systems (OSes), as hackers mostly target flaws in these.

It is a given that most Internet threats today target Internet Explorer (IE) over other browsers such as Mozilla Firefox and Opera. To stay safe, users have to constantly patch this software. But patches only protect against yesterday’s threats; who is to say a hacker will not exploit an as-yet-undetected flaw?

Here is where alternative browsers come in. Other browsers, with their much lower market share, are usually ignored by hackers; thus they tend to be safer. A good start would be Mozilla Firefox, currently at version 2.0.0.9. The downside is that certain features may not work in alternative browsers, the most common being ActiveX controls, which are only available in IE. It’s helpful to note that ActiveX itself is responsible for many of IE’s security loopholes.

For OSes, it’s much tougher. This entails switching to alternative platforms such as Linux and the Mac. Apple’s Mac OS can only be used on Apple’s own proprietary hardware, so if you want to keep your current hardware, Linux is the only remaining viable option.

Linux is an open-source operating system that is usually available free-of-charge on the Internet. The installation and usage of Linux is still considered rather geeky despite its improved user-friendliness over the years.

And the problem of missing or different features becomes magnified here. You can expect much of the third-party software you use now, such as games, to be incompatible with Linux. It is almost mandatory to use alternative software like Firefox and OpenOffice for your day-to-day work. This entails a steep learning curve that not many users want to climb.

(Open source: software whose source code is readily available for the public to scrutinise for bugs.

OpenOffice: a free, open-source alternative to the Microsoft Office productivity suite.

Linux actually has a limited ability to run Windows applications through compatibility layers such as Wine and virtualisation software such as VMware. But these tools are usually slower, and not all software works on them as efficiently as on its native platform.)

Even hardware suffers, due to Linux’s small installed user base. Specialised hardware such as TV tuners, wireless adapters, scanners, webcams and biometric readers may not be usable if the manufacturer cannot be bothered to write a Linux driver for it.

Because of this, most Linux users operate in a dual-boot environment. This allows them to switch over to Linux with a simple reboot if the need arises. That said, partitioning a drive to make installation space is HIGHLY not recommended if you are just a novice user.

If you are really daring, you can go ahead and take the plunge to install Linux, like I did. Thankfully, there are some Linux distributions that let you “test-run” Linux in the form of Live CDs before installing it. These distributions do not touch any files on your hard disk; any time you want to go back to Windows, just reboot.

Common conventional Linux distributions: Fedora Core, OpenSUSE, (K)Ubuntu and Mandriva.
Common Live CDs: Knoppix, Puppy Linux, Damn Small Linux.

Common sense

Things usually go wrong when we deal with the human side of machines and technology, and computer security is no exception. In every system, humans are always the weakest link. Common sense is supposed to be common, but surprisingly some people do not have it, probably due to a lack of knowledge.

Now is the time for me to preach my ten commandments of computer security. Print them out and paste them in front of your computer if necessary.

Rule 1: Do not haphazardly accept any attachments you receive. Whether they come through email or instant messaging, clicking one gives a virus the opportunity to activate itself.

Rule 2: Do not deactivate any security programs you are running. Just because they are an irritant does not give you the license to take a chance without protection. A lock is only effective if it is used all the time.

Rule 3: More does not mean better. Having more than one security program of the same class may cause them to conflict and create unnecessary problems.

Rule 4: Stay away from piracy! Do not download stuff such as games and screen savers from the Internet without first scanning it for rogue material. Buying pirated goods from the pasar malam may get you more than you bargained for.

Rule 5: Backup important data regularly.

Rule 6: Never give out personal information and passwords to anybody.

Rule 7: Change passwords regularly; once every 6 months would be prudent. Make sure they are at least 8 characters long and hard to crack (a small sketch of generating one appears after Rule 10). If you are afraid of forgetting your passwords, write them down on a physical medium and store it in a safe place.

Rule 8: Be observant. If you feel that your computer or Internet connection is getting slower and slower, there is a high probability your computer is infected by a virus.

Rule 9: Key the exact web URL into the browser’s address bar. This prevents phishing, where a hacker sets up an identical-looking site to harvest personal information.

Rule 10: Follow the above rules, but do not be complacent. Stay vigilant!
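Here is the small sketch promised in Rule 7: generating a strong random password in Python using the standard library’s cryptographically secure randomness. The alphabet and length are my own arbitrary choices.

    # Sketch for Rule 7: a random password of 12 characters drawn from a
    # mixed alphabet, using cryptographically secure randomness.
    import secrets
    import string

    def make_password(length=12):
        alphabet = string.ascii_letters + string.digits + "!@#$%^&*"
        return "".join(secrets.choice(alphabet) for _ in range(length))

    print(make_password())  # e.g. k3@Zq9!vRm2B (different every run)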

Next, Part 3 : Future Roadmap of Computer Security

November 18, 2007 | computers

Com Security Part 3: Future Roadmap

Back, Part 2 : Tips for End users

Given the level of sophistication of the Storm Worm, it is safe to assume that the viruses of tomorrow will be more potent. Conventional methods that I have detailed earlier may not be sufficient to handle the threats of tomorrow.

It is like a constant evolutionary battle between predator and prey. Hackers will always find ways and means to evade the software companies. Software companies will always devise newer tactics to detect and eliminate these malware.

With the increasing penetration of technology into most aspects of our lives, information security is becoming ever more important. Notice that I have used the word “information” in place of “computer”; that is the future era of computing. Computers are becoming omnipresent as we speak.

The concept of the computer has also changed several times over the course of history: from Charles Babbage’s enormous Analytical Engine, to room-sized mainframes like the US-made ENIAC, and finally to the desktop computer of today. The next transition will be towards ever more powerful wireless handhelds (e.g. handphones and PDAs) and Ultra-Mobile PCs (UMPCs).

In the future, all our (personal) information will be stored on these machines. The financial and intangible costs associated with compromised information will be devastating; the importance of security can never be underscored enough. Thus, steps are being taken now to ensure that we have a secure computing future.

Due to the increasing number of viruses targeting the Windows platform today, many corporate IT departments are beginning to turn to alternative platforms such as Linux and the Mac in search of safe refuge.

These two platforms have their roots in the UNIX system, which has traditionally been viewed as more secure than the Windows environment due to its stringent multi-user, multi-privilege architecture. This is in contrast to the DOS days (Windows’ roots), where the OS would automatically log in a single user with root (administrator) privileges.

With this level of competition, Microsoft is not resting on its laurels. It has continuously beefed up the security of its products, and it introduced the multi-user NT line of OSes way back in 1993 in response.

With the proliferation of enforced multi-user separation, will we be safe from viruses? No, we never will be; there are always countermeasures.

The era of zero-day attacks has just begun: a virus released on the same day someone discovers a flaw leaves system administrators and home users no time to patch their machines. Does the future look bleak? Again, the answer is no. Software companies are writing increasingly sophisticated heuristic programs that detect potential threats before their signatures are known. This is the only viable defence available so far against zero-day attacks.
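As a flavour of what “heuristic” means here, consider one simple signal: packed or encrypted malware bodies tend to look like random bytes. The Python sketch below flags high-entropy data; real engines combine many such signals, and the 7.5 bits-per-byte threshold is an arbitrary choice of mine.

    # Toy heuristic: flag data whose bytes look almost random (high entropy),
    # as packed or encrypted malware often does. The threshold is arbitrary.
    import math
    import os
    from collections import Counter

    def entropy(data):
        """Shannon entropy of a byte string, in bits per byte (0 to 8)."""
        counts = Counter(data)
        total = len(data)
        return -sum(c / total * math.log2(c / total) for c in counts.values())

    def looks_suspicious(data):
        return len(data) > 0 and entropy(data) > 7.5

    print(looks_suspicious(b"plain old text " * 100))  # False: low entropy
    print(looks_suspicious(os.urandom(4096)))          # True: random bytes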

What about seeking refuge in alternative software? Remember, this suggestion arose because those platforms were largely ignored by hackers, not necessarily because they are inherently more secure. With their increasing popularity, it may no longer be safe to rely solely on such software for security.

Rule 9 may not be safe forever either. Website addresses are actually human-readable references to IP addresses. For example, when you go to http://www.nanyangjc.org , it actually refers to the IP address 210.193.3.125. The collection of such pairs rests on fewer than 50 root DNS servers worldwide. By hacking into one of them and changing the pair assignments, hackers could in theory wipe a legitimate page out of existence and replace it with their own. So far, some have been brought down by highly coordinated DoS attacks, but none has yet been hacked into.
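The name-to-address lookup described above can be seen directly from Python’s standard library. Note that the IP returned today will likely differ from the 2008 address quoted in the text.

    # The DNS lookup described above, via the standard library. The address
    # returned today will likely differ from the one quoted in the text.
    import socket

    print(socket.gethostbyname("www.nanyangjc.org"))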

Remember the transition I mentioned earlier towards handphones? Right now, they are quite immune, as their processing power and software have yet to reach the level of desktop computers. But this is bound to change with the ever-growing number of features available on them. Modern phones can already run advanced operating systems such as Symbian and Windows Mobile. The more powerful the features, the greater the propensity for hackers to target them.

What about wireless keyboards and mice, that long-neglected area of security? A hacker nearby, armed with a portable RF receiver, can easily capture the keystrokes from your wireless keyboard without your ever knowing it. There go your passwords and bank account numbers. Moral: use a conventional wired keyboard or one with built-in encryption.

The mushrooming of WiFi networks is also a concern. Many people use these networks to surf the net, and large numbers of them do so without applying any security settings. Moochers can easily log on and surf the Internet for free; more technical moochers may even snoop data from the unencrypted airwaves.

Even encryption may not be secure now. WEP and WPA encryption have been successfully cracked, with tools easily available on the net. The only encryption system remaining secure on the WiFi platform is WPA2, but as with all other encryption standards, WPA2 will eventually be broken as computer processors improve.

There are many other possible security breaches in the technology realm, too many for me to cover here. The future indeed looks uncertain. What is certain, however, is that as our lives depend more on technology, the need for information security will grow ever stronger. The battle between the good guys and the bad guys has not reached its climax yet.

End of article

Word Count: About 4000

Back, Part 1: History of Computer Viruses

November 18, 2007 | computers

Digital Dependence

This article is another one for the PCWorld Community Voices. I initially wanted to post it on 18 August but totally forgot about it. So here it is. View the web version here if you wish; they are both exactly the same.

Start

As you go about your life today, reflect: how many electronic devices have you used? Laptops, handphones, BlackBerrys, iPods? Now imagine that one day these devices were taken away from you and you still had to continue your life as per normal.

I will bet my bottom dollar that most of you would not be able to survive “efficiently”. This is the price we pay for being too reliant on such gadgets, and for gadgets being too converged. We see these devices as virtual extensions of ourselves into the realm of cyber communications.

The devices we carry today are more sophisticated than ever before. Take the new Nokia N95 for example: it can function as a pocket computer, GPS receiver, normal handphone, MP3 player and loads more. With such features, one would think, is there a need for me to carry another device? Herein lies the problem: having just one device means convenience, and yet it is all too easy to lose that one device compared to losing all four. When that happens, we “cannot survive” for as long as we are unconnected, or so conventional wisdom implies.

But one has to admit that in the growing number of technologically advanced societies worldwide, accurate information has to be obtained fast. The astonishing pace at which information reaches us may even “overload” us. Ironically, some people like it that way; some even arrogantly claim that without such access there is no way to succeed in life.

Now I’m not advocating that we throw away our handphones and laptops. My point is that all of you should sit down and reflect upon what is really necessary in your lives. Is it really important to stay connected 24/7? Or should you allocate more time to your neglected families and personal hobbies?

How did I know all this? Because I lost my handphone for an entire day before finding it again. In that span of 12 hours, I felt emotionally lost, having lost the ability to communicate with friends who were physically out of sight. That was when I realised it: the invention of digital communication devices has made our lives more complicated than ever before and, most of all, made us addicted to the taste of increased communication.

September 8, 2007 | computers, Science and technology

Obsolescence

This is the first article I wrote for the PCWorld Community Voices. Here is the exact link to the article.

Ever felt that the electronic gadgets of today simply do not last as long as their older counterparts? Or that software companies keep bombarding you with demands to upgrade? Why should this be the case?

The former is called planned obsolescence, the latter tech obsolescence.

I’m sure many people in the IT industry are familiar with tech obsolescence. For those who are not, let me explain: it occurs when an older technology is sidelined for a newer version even if the former is still in good working condition.

Planned obsolescence happens when companies build a product to last for a fixed finite time. Usually this is “accomplished” by cutting corners and relaxing quality control.

Let’s take a look at software companies today. Many, usually the proprietary ones, like to come up with something “new” every couple of years. This is especially true of one particular operating-system maker, which claims that its newer product incorporates better technology and is more secure, despite the many teething problems that persist even now. And to “force” users to migrate, it has decided to stop supporting its older versions in due time.

This is not limited to software; hardware makers should not be spared the accusation either. When a product fails mysteriously after a couple of years or even months, one may wonder: when did I ever mistreat it?

So what are we consumers to do? Well, the solution is simple: turn to alternatives. Open-source (free) software, for example, tends to run on most kinds of platforms, even ancient ones. For hardware, turn to manufacturers with a better track record. This is difficult to accomplish, but it can be done.

We the consumers have the power to decide with our wallets; we cannot allow any software or hardware company to dictate our decisions. We choose, not them.


The blogger is a 16-year-old college student from Singapore. He has always been trying ways and means to “legally” break past any computer or network system he encounters, but he currently lacks skills in any programming or scripting language.

August 12, 2007 | computers