I wondered why my computer at home showed a Windows update when it wasn’t Patch Tuesday anymore. It appears Microsoft has found another RPC exploit in its OSes (2000, 2003, and even XP with the latest SP2 or SP3 installed). Susan Bradley from Windows Secrets posted about this today, and it’s worth a read to keep yourself protected.
The Mono Project has just released version 2.0, allowing .NET developers to test and port their applications (sometimes with little or no code change) so that they can run on other platforms like Linux and Mac. The latest version of Mono is compatible with the server and desktop editions of version 2.0 of the .NET platform. This now allows software developers and software houses to reach a wider range of clients, since they can develop software using Microsoft’s architecture that will run on different platforms.
With Intel introducing their 45nm Xeon processors, which have up to 6 cores, servers can now scale up to 16 processor sockets, which means 96 cores on a single server.
The exponential growth of the Internet, the accessibility of information and the expected richness of content (real-time streaming video) mean we need more powerful servers to cater for our needs. More servers mean more physical space, which in turn means more power being used. But the new processors pack much more punch while drawing less power. Less power also means less heat emitted from the processors, which in turn requires less cooling.
Virtualization is becoming the preferred way of hosting these days: by utilising more powerful servers with more memory and disk space, multiple copies of server operating systems can be installed on a single machine. Packing more virtual servers onto a single physical server uses less space and less power than we normally would. Tests have shown that this type of virtualized setup delivers almost a 50% improvement in performance and up to 10% less power consumption.
With Microsoft Server 2008 and hardware-assisted virtualization using processors from Intel (and AMD), true virtualization is now more possible than ever. Now it’s not only the likes of Amazon, with their EC2 cloud-computing server architecture, but smaller companies and ISPs too that can offer virtualized servers for their own and clients’ use. The Geeks at How-To-Geek have an article on this very topic explaining Hyper-V and virtualization in more detail; read on to find out what true virtualization has to offer, and see how it has become possible to have Hyper-V solutions with up to 2TB of physical memory and 64GB of memory per virtual server instance!
If, like me, you’re still using Windows XP, you may already have SP3 installed. Others of you out there might still be skeptical about installing it, especially given the multitude of problems people have encountered.
I’ve been running Windows XP SP3 on my home and work desktops since the end of May 2008, and on my work laptop since the beginning of August. All 3 systems have been very stable since the service pack installation completed, though getting there wasn’t without a few problems.
The desktop machine at work was installed by the IT guys before I joined, and they did some sort of magic to RAID the two hard drives without using a RAID controller, basically forcing Windows Server software RAID onto a Windows XP Pro OS. This was fine for the 6 months the machine ran SP2 prior to installing SP3. After installing SP3, however, the machine blue-screened almost immediately after the install completed and a reboot was required. No matter what trick I tried to fix it (and let me tell you, I tried just about everything you could find through Google), nothing worked until I gave in, reformatted the boot drive and did a clean install. Straight after the clean OS install I installed SP3, and things have been fine since.
The work laptop was also a clean XP install with SP3 applied straight afterwards; I’ve been using it for the last 6 weeks doing development without any problems.
My home PC had recently been rebuilt after I upgraded most of the hardware, so it had a clean OS but had been running various software, networking and development tools with SP2. I thought, well, let’s give it a try here too; worst-case scenario, I’d need to rebuild it again. So I installed SP3, and the first thing that didn’t work after the reboot was my Internet connectivity. After a lot of digging and back-and-forth with my ISP’s tech support, as well as Billion’s tech support, I managed to fix the problem (mind you, not without installing and uninstalling SP3 three times before figuring out what it was).
Somehow, somewhere, my network drivers had become corrupted under SP2, yet strangely they worked fine with SP2, which led me to believe that SP3 was the culprit. Eventually I uninstalled SP3 for the third time and decided to remove all networking components from Device Manager and reboot the machine. Upon reboot, of course, Windows re-detected the hardware and re-applied the drivers for the newly found networking components. After a quick check that everything was working, I reinstalled SP3 for the fourth time. This time, after a reboot, everything worked as it should. And a little over 3 months later, the desktop machine at home is still stable.
I’ve never been a fan of software that forces downloads down my throat, especially without my express permission, which is why I’ve always set Windows Update, and every other application or game, NOT to download updates automatically. Why? Because I’m a firm believer in “if it ain’t broke, don’t fix it”.
And this has SO much been the case with just about every Microsoft product for as long as I can remember. How many times have you experienced or read about an update (or even a service pack) that, once installed, ruined machines or caused more problems than there were to begin with? Countless times!
As South Africans we’ve not really had the pleasure of cheap bandwidth (cost being the biggest issue), so I, like many other South Africans, was a dial-up user until a few months ago. Have you ever tried downloading big updates (never mind a service pack) over dial-up? Even as a newly converted ADSL user, bandwidth is still a cost issue for us, so my Windows Update setting is still set to Check for Updates (no updates are downloaded; you’re simply notified that updates are available for download and install). As a tech-savvy computer user I keep up to date with the latest news, know most of the time where the vulnerabilities on my machine are, and decide for myself whether or not a given update is required. I don’t have unnecessary services running and don’t use certain Windows components on my home machine, so I’m comfortable deciding when and what Windows Update downloads and installs.
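If you’d rather script that notify-only setting than click through the Automatic Updates control panel, the same behaviour can be set in the registry. This is just a sketch of the standard Automatic Updates policy keys on XP (AUOptions = 2 means “notify before downloading”); double-check it against your own setup before importing:

```
Windows Registry Editor Version 5.00

; Automatic Updates policy: check and notify, but do not download automatically
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU]
"NoAutoUpdate"=dword:00000000
"AUOptions"=dword:00000002
```

Save it as a .reg file and import it, then confirm the result in the Automatic Updates control panel.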
And now on to the crux of the matter: it appears Microsoft decided long ago that as long as an Internet connection is available, Windows Update will update itself whenever it wants to, without your knowledge or approval. Which, in my mind, means Microsoft has imposed spyware on us (interestingly, Microsoft’s own Malicious Software Removal Tool seems to ignore it).
Scott Dunn has been a contributing editor for PC World since 1992, and he also writes articles for the Windows Secrets website. He has a great article explaining in more detail how Windows Update actually works and how to tweak it to your needs.
If you like being organised like I do, you probably use to-do lists, both personal and work-related ones. Well, if you do and want to know some of what’s out there, or want to see if there’s another one you might prefer, then LifeHacker’s latest post on the Five Best To-Do List Managers will help you. It showcases five of the best.
Personally, I like to use Outlook’s tasks and calendar to manage my daily reminders.
For as long as I can remember there have been websites, emails and books on various ways to optimize your version of Windows, be it through configuration changes, registry hacks or shutting down various services.
But time and again, various “optimization techniques” get reused and republished as ways to improve your machine’s performance, yet many actually have no impact at all, or even a negative impact on performance, and, when applied by users not entirely sure of what they’re doing, can cause total system crashes.
On How-To-Geek there’s a great article summarising and debunking some of these so-called performance enhancers as myths. The article links to a few posts on LifeHacker with further information on the issue.
Ask around if you’re not sure before you attempt to “enhance” your machine’s performance to avoid unnecessary inconvenience.