
To Core i7 or Not? Just Go for a Core2 Quad Q9550 April 29, 2009

Posted by fvter in Hardware, Technology.

Today, I started to notice some weird fan noises coming from my home PC, which we use for general all-around work and also for playing PC-based games. It made me realize that my desire to move to better hardware may have to be acted on sooner rather than later. Now, as much as I would prefer to move to something like a MacBook Pro, I don’t have the budget to undertake that kind of move.

Thus I started to investigate the possibility of upgrading the hardware to either a Core2 Quad platform or a Core i7 platform. To be honest, whichever way I go, the upgrade would require a motherboard and RAM upgrade on top of the CPU. Also, I am more interested in going the Quad route to be able to multitask better :- I want to be able to watch or even edit multimedia all the while playing World of Warcraft.

Visiting my favorite parts supplier in France, I noticed that the price of the Core2 Quad vs. the Core i7 was not that different (around €50 to €75), but the killer price impact is the motherboard and the need to use DDR3 RAM. The comparison involved keeping the same basic hardware infrastructure with only the CPU changing. That means that whichever direction was taken, the number of ports, memory (going for 8GB), I/O support, audio, etc. would be an almost 1:1 comparison. References to the different parts are attached in the links section of this article.

Truth be told, I would much rather go with the Core i7 option, as it would have a longer life span. Unfortunately it’s still an expensive option, and for the same price or even less I could essentially walk away with not only the CPU/motherboard upgrade but also a brand new ATI 4890 graphics card. There is a whopping €225 to €275 difference, which is not negligible and can’t be ignored.
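
To make the trade-off concrete, here is a toy sketch of the comparison. The individual part prices are my own illustrative assumptions; only the €50–75 CPU gap and the €225–275 overall gap come from the actual quotes above.

```python
# Illustrative 2009 prices in euros. These exact part prices are assumptions,
# chosen only so the deltas land in the ranges quoted in the post
# (a 50-75 euro CPU gap and a 225-275 euro overall gap).
core2_build = {"cpu": 280, "motherboard": 120, "ddr2_8gb_ram": 90}
i7_build = {"cpu": 340, "motherboard": 220, "ddr3_8gb_ram": 190}

cpu_gap = i7_build["cpu"] - core2_build["cpu"]
total_gap = sum(i7_build.values()) - sum(core2_build.values())

print(cpu_gap)    # 60: within the quoted 50-75 euro CPU difference
print(total_gap)  # 260: the motherboard and DDR3 RAM do most of the damage
```

As the sketch shows, the CPU itself is a minor part of the gap; the platform (motherboard + DDR3) is what pushes the i7 route out of budget.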

You’ve got to hate having to make these kinds of decisions! Seriously, I wish I had cash to spare…

Let me know your thoughts and/or comments through this article or via my Seesmic Profile.

Related Links


«Sign-In with Twitter»: Should we be Scared? April 22, 2009

Posted by fvter in Rants, Security, Technology, Web.

Last week, Twitter opened up its «sign-in with Twitter» open authentication (OAuth) service somewhat under the radar. To be fair to Twitter, the news last week was more focused on the one-million-follower story and the arrival of big media names onto the service. Now, I’ve always been an advocate of using OAuth-type services (I personally use OpenID as much as possible) to both simplify a user’s life and avoid the problem of password re-use.

It also goes to Twitter’s credit to move in this direction and to provide this type of service to ease the integration of external applications, as well as to make it easier for users to provide their Twitter information.

Disclaimer: I have not had the time (and that’s not likely to change in the near future) to fully investigate and examine the security of the Twitter OAuth service. The following rant is purely about Twitter’s current public track record…


Twitter’s public track record of securing and running a reliable service is less than stellar. My top 3 issues that have been discussed, re-discussed and overall made serious news for Twitter can be summed up with this list:

  • The service has a long history of availability issues, or rather non-availability in times of high traffic; although this hasn’t occurred in a while, it’s bound to happen again given the growth patterns of late;
  • Security commentators have repeatedly criticized the continued use of basic authentication (i.e. accepting base64-encoded passwords) to use the service. The problem is that this is an easy way to grab the user’s password, which would break or poke serious holes in the OAuth service;
  • There have been a repeated number of XSS attacks and worms, including the most recent Mikeyy worm, which lasted over two weeks across its different iterations.
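
To illustrate why base64 offers no protection, here is a quick sketch (the credentials are made up): anyone who can observe an HTTP basic-auth Authorization header recovers the password with a single decode call, since base64 is an encoding, not encryption.

```python
import base64

# A made-up Authorization header, as a client using HTTP basic-auth sends it.
header = "Basic " + base64.b64encode(b"alice:s3cret").decode()

# Recovering the credentials takes one call -- no key, no cracking needed.
user, password = base64.b64decode(header.split(" ", 1)[1]).decode().split(":", 1)
print(user, password)  # alice s3cret
```

This is exactly why handing your real password to third-party Twitter apps over basic auth undermines whatever OAuth is supposed to fix.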

These three points make me question whether I would be able to really trust such a service. Will I be able to use it at all times? Am I sure the authentication won’t lead to a password leak? Am I sure that the OAuth tokens won’t be replayable? Can I be sure that the OAuth session isn’t being misdirected or stolen somehow via XSS or a worm? It makes me wonder whether the service will actually provide a decent and safe mechanism for authentication and whether or not my credentials are going to be safe :- scary…


Old Posts Appearing in Feed… April 15, 2009

Posted by fvter in General, Rants.

So you may be seeing some old posts appearing that had not been published before. I finally got around to finalizing and editing the content of some older drafts that I think needed to be published, just for the historical record.

I am hoping to avoid this situation in the future and should be able to keep a normal release schedule between draft status and published. Let’s hope it works!

To Vista or Not? Need a 64bit OS but Linux not an Option – Your Thoughts April 7, 2009

Posted by fvter in Hardware, OS, Rants.

So I am planning on moving my home PC to a 6GB memory base (and also moving to a Core i7). Because of the limitations of a 32-bit system in supporting memory over 4GB, I am going to have to move to a 64-bit platform. So the question becomes: which OS should I use on this new hardware configuration?
The machine in question is used extensively for gaming and other «productivity» usages by all the members of my family. Let’s just say that Linux is not an option on this machine, for many different reasons, including the fact that a lot of the games and apps I use just don’t work properly…
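
The 4GB ceiling, by the way, is simple arithmetic: a 32-bit pointer can only name 2^32 distinct byte addresses, so that is all a 32-bit OS can directly address. A one-liner makes the point:

```python
# A 32-bit address is 32 bits wide, so the addressable space is 2**32 bytes.
addressable_bytes = 2 ** 32
print(addressable_bytes // 2 ** 30)  # 4 (GiB) -- hence the jump to 64-bit
```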

So I need advice on which way to take the platform:

  • Should I move to Vista Home Premium? or,
  • Should I stick with Windows XP 64bit?

Your thoughts and opinions are very much appreciated; please don’t hesitate to comment on this post or post video replies to this Seesmic thread!

OnLive :- Thoughts and Ramblings April 4, 2009

Posted by fvter in Gaming, Hardware, Networks, Technology.

During this week’s GDC’09, the OnLive service was announced and demoed. I can only really comment on this service based on the reviews and reports coming out of Joystiq, Gamespot, Engadget and others… The idea behind OnLive is to marry cloud computing with high-end PC gaming. It is best described on the OnLive website in How OnLive Works. Personally, I find this service intriguing and potentially a mini-revolution («revolution» might be a bit strong, hence the «mini» prefix) in PC gaming. It also has the potential to open up availability and introduce gaming to a much more global audience who don’t have the buying power. It could also be a simple and interesting entry-level platform for testing games before purchasing.

Much of the initial commentary ranges from amazement at how incredible a service like this would be, to a «yes, but» attitude and scepticism about the actual possibility and ability of the service to work. The main concern in current commentaries is the ability of the service to perform as stated, due to a lack of network bandwidth and responsiveness. While I do agree that there are a lot of challenges for this service to get things working as smoothly as possible, my humble belief is that it will get off the ground and have more than acceptable performance. One of the reasons I feel strongly about this is some of the minds behind OnLive. Steve Perlman, being one of those minds and one of the original technological minds behind QuickTime, has done a lot for streaming and has already provided some amazing solutions for optimizing the interaction between the user and media across the Internet [ed. note: I’ve had the chance to see Perlman talk at an Apple developer conference and, to be quite honest, he knows what he is doing].

However, I need to disagree with the main focal points that a lot of the commentary has taken. Much of it has centered on the belief that the service won’t work because of network performance. My rant here has a lot to do with the fact that most of these reviewers are making assumptions based on their current network experience, which is mostly USA-, Canada- or UK-centric. These assumptions are based on areas where ISP performance is average rather than fantastic, or where there are known (or suspected) ISP network controls and restrictions. Nigel Cooke, on a recent episode of his Monkyenuts podcast (episode 7), brought up similar comments, with a touch of his own experience optimizing and managing corporate networks. While I respect his knowledge on the subject, you can’t compare an Internet-based service and network optimization approach to that of a corporation. Most corporate networks are based on a hub-and-spoke model, which tends to lead to fixed route paths and a series of bottlenecks that hamper performance. The Internet, being a much more meshed environment, is not constrained in this manner, at least not until the last leg between the user and the ISP.

The problem with this overall line of thought is that it doesn’t reflect the reality of what the network can actually do (where I live, my two ISPs provide me with amazing performance: average latency of ~400ms, ~1100kb/s down and ~350kb/s up) and the potential that a service like this has with a proper network environment, network optimization and, more importantly, optimization of the compression and handling software. The comments also don’t take into account the advances that have been made in data center hardware and networking over the years, especially by companies like Google, which have learned how to make small-footprint, high-performance hardware and optimize its placement to better serve the Internet.

Finally, some of my belief is founded on the fact that I have been involved over the years in projects where bringing distributed high-end PC computing over a network was successful. In one similar case, all graphics processing and manipulation was done on centralized core machines, while the user was provided with a web interface to manipulate the data and visualize the graphical models and displays.

So it is definitely a gaming technology to keep an eye on, and potentially something bound for success. I for one would use this type of service to avoid the heartbreak of having to own multiple PCs or continuously upgrade those machines!

Discuss this with me via my Seesmic Profile on this thread.


A Friend’s Blog Got p0wnd March 17, 2009

Posted by fvter in Security, Technology.

I spent a good part of today investigating a JavaScript injection that a friend of mine suffered on his personal blog. It turned out to be nothing more than a typical adbot/scriptjacking malware infection. The actual injection code is an obfuscated iframe that tries to download an HTTP browser attack tool. The code is inserted in the page build (usually via the WordPress function framework, the style sheet, or possibly a rogue module) and looks something like this:

malicious JavaScript code

The obfuscation resolves to a call that pulls a source script from a website hosted at add-block-filter.info, which then tries to either retrieve stored passwords & cookies or hijack open web pages, generally targeting e-mail services to send out spam (your typical adbotnet behaviour).
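
For what it’s worth, the hunt itself was mostly pattern matching. Here is a minimal sketch of the kind of triage scan one can run over theme and plugin files; the patterns are illustrative examples of common tells, not a complete signature set.

```python
import re

# Illustrative tells of an injected script: obfuscated eval/document.write
# chains and hidden or zero-width iframes. Real infections vary widely.
SUSPICIOUS = [
    re.compile(r"eval\s*\(\s*unescape", re.I),
    re.compile(r"<iframe[^>]+(?:visibility:\s*hidden|width=.?0)", re.I),
    re.compile(r"document\.write\s*\(\s*unescape", re.I),
]

def scan_text(text):
    """Return the patterns that match, as a quick triage signal."""
    return [p.pattern for p in SUSPICIOUS if p.search(text)]

# A sample resembling the injection described above (domain from this post).
sample = '<iframe src="http://add-block-filter.info/x" width="0" style="visibility:hidden">'
print(scan_text(sample))  # one hit: the hidden-iframe pattern
```

A scan like this only flags candidates for manual review; the obfuscated payload still has to be decoded by hand, as described above.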

Tracking back the domain name, it came back to a known malware pusher, 7addition.info/8addition.org. So in all likelihood this is a new variant of a script-injection attack, which when picked apart revealed a known trojan-downloader JavaScript iframe infection (at least as reported by a few AV vendors, e.g. trojan-downloader.js.iframe.ah). In this case, the trojan goes on to contact 2 other malware sites, at firstgate.ru & benyodil.cn, which in turn download 3 additional malware infections to continue the p0wnage:

  • a malicious Flash file which is in fact a download exploit (e.g. Exploit.SWF.Downloader.ks);
  • another HTML-based script which is in fact a trojan download agent and also sends out spam asking you to visit a site or click on a video link (e.g. Trojan-Downloader.HTML.Agent.np);
  • and finally, a packed JavaScript HTML agent which installs a BHO (browser helper object) that turns off the firewall and other Windows services (e.g. Packed.JS.Agent.ad).

That’s as far as I went with the malicious activity…

Before investigating, my friend and I exchanged a few messages regarding him being p0wnd. He was trying to figure out what had been the root of his infection. Although he blames it on a combination of Twitter/Hotmail and a few other sites, seeing the root of the malicious software that gets pushed, I would say that he originally got hit from visiting an already-infected site or from clicking around some weird website with Flash videos (he does love to visit those). Interestingly enough, I think I can track part of his problem back to the 13th of March or a few days before. At that time I received an e-mail from him that was unusual:

I didn’t really pay attention to it, but maybe I should have, and warned him at that time of the possible hijacking of his info. He learnt a few things (like not using the same password for his different services). I learnt for myself that when I see a friend sending me a weird message, I should get on the ball and warn him/her.
Some more advice I offered was to:

  1. Update his personal blog framework regularly;
  2. Be careful about using the «remember me» option on some of these websites, as the stored cookies give this clickjack malware a fair bit of leverage.

In These Times, Can You Protect the Business From Insider Threat March 5, 2009

Posted by fvter in Security, Technology.

This post & these thoughts are a reflection on my experience and years of dealing with the problem of identity management: how to relate a user to his roles and responsibilities in the IT infrastructure, and how this affects departure processes (or exit procedures).

As the economic recession goes into its darkest times, businesses are making the hard choice of letting people go. The IT organisation is typically an area where decision makers take the opportunity to trim the fat. However, an important part of the decision-making process, one that can easily be overlooked, is a good understanding of the risk involved in letting go of certain categories of IT staff, and of how their roles and responsibilities can create a serious exposure footprint.

Why would HR & the security officers need to establish this risk analysis? The simple answer is that businesses need to ensure that staff who potentially hold the keys to the kingdom are not irate when they leave. The risk is that an irate ex-employee with the key information needed to access the infrastructure may be tempted to take action out of frustration or revenge. This unfortunate (and, let me be clear, sometimes illegal) type of action can involve damage ranging anywhere from serious data leakage to denial of service, hampering a company’s ability to do business.
A few example scenarios of a departing IT staffer’s role versus what they can do:

  • A network engineer (remember the San Francisco city network incident) who has extensive knowledge of the network configuration and holds some of the common super-user passwords could place back-doors allowing him to later bring down the network, redirect traffic out of the corporate network to release sensitive information, or even use the network as a way-point for other types of illegal activities.
  • How about a server system administrator who has local administrator access to boxes and can place a backdoor allowing remote access, and thus the ability to grab information or even stop critical business applications?
  • But even more critical (at least in my experience) is surely a security engineer: the knowledge of the security profile, and the accesses made available to that profile, make this the highest-risk footprint. To do the job, he/she has gained knowledge that renders the infrastructure critically vulnerable.

So the question that begs to be asked out loud is: can a company avoid these issues?

The real protection a company can achieve is a comprehensive identity management process and tool. Identity Management (IdM) is about a lot more than just being able to determine who works in the company, which unfortunately is the baseline thinking or the minimal implementation that usually gets carried out. It’s also about being able to link a person to his/her role and authorizations.

A well-implemented IdM process and infrastructure will ensure that each person in the organization has a well-defined role, and that role will correctly identify his/her authorizations and access rights. The ability to correctly define those authorizations provides a safeguard and a well-defined means not only to properly implement an exit procedure, but also to evaluate a risk profile based on that person’s footprint in the organization. The well-defined profile ensures that the user is matched exactly to the tools & resources required for the job: no more, no less. The same correlation can then be used in the exit procedure to quickly identify and revoke all accesses. There are of course many more day-to-day operational benefits to a complete IdM environment, but that may be the subject of a separate post.
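
To make the idea concrete, here is a toy sketch of the person → role → entitlement mapping and a one-pass exit procedure. The role and entitlement names are purely illustrative, not taken from any real IdM product.

```python
# Each role maps to a fixed set of entitlements; a person maps to roles.
# Access is always derived from roles, never granted directly to a person,
# so revoking the role assignments revokes everything in one pass.
ROLES = {
    "network_engineer": {"router_admin", "vpn_config", "netflow_read"},
    "sysadmin": {"server_root", "backup_restore"},
    "security_engineer": {"siem_admin", "firewall_admin", "pki_ops"},
}

people = {"alice": {"network_engineer", "sysadmin"}}

def entitlements(person):
    """Everything a person can touch, derived purely from their roles."""
    return set().union(*[ROLES[r] for r in people.get(person, set())])

def exit_procedure(person):
    """Drop the role assignments; all derived access falls away with them."""
    revoked = entitlements(person)
    people.pop(person, None)
    return revoked

print(sorted(exit_procedure("alice")))
```

The point of the sketch is the correlation: because every access is reachable from the person’s roles, the exit procedure can enumerate and revoke it all without hunting through individual systems.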

The simplistic answer, or quick fix if a comprehensive IdM is not in place, is to make sure the person leaves on good terms. The important part is to evaluate the risk versus the cost versus the potential loss. Unfortunately, that is a short-term strategy and somewhat impractical.

Related Links

Using TweetDeck’s Features March 2, 2009

Posted by fvter in Technology, Web.

Following a pleasant feature review (or how-to) of TweetDeck by Cali Lewis on GeekBrief.TV ep. 517, I figured it was about time to actually sit down and fully investigate the different functions in the utility, and I did so this weekend.
I’ve been using TweetDeck on and off for quite a few months now (I alternate with Twhirl) and was only really using it with some of the default columns :- all friends, replies, directs.

From time to time I also used TwitScoop, and on rare occasions I would also run a search. For what it’s worth, that basic mode in itself is a very functional Twitter interface, with the benefit of quickly letting you see the tweet feed plus tweets where you are mentioned. That’s about it for the basic TweetDeck features :- this post is about some new features rather than a review.


The function that intrigued me during Cali’s how-to was the groups feature. With it, you can group different Twitter accounts into one column. I follow a number of tech & general news magazines/webzines that use Twitter as a form of notification, and the group functionality lets you put these all together for quick identification and review. I set this up, along with a couple of other groups (tech products, security & close friends’ tweets).
After running TweetDeck like this for a few days, my conclusion is that it’s useful and interesting, but… (the buts are related to the following two points):

  • you need a really wide monitor (on my laptop this is inconvenient, as you spend much time scrolling in all directions);
  • it hasn’t given me any noticeable benefit in the way I look at tweets! The main reason for this is that I tend to speed-read through most Internet chatter/info and focus on the points that catch my eye or raise a flag. I can do this quite easily and efficiently in the general all-friends feed. This, however, could be a side effect of the lack of screen real estate.

I’ll continue to use these features, especially on my gaming rig with the big monitor, where they will give me a better view of things. On another note, it would be nice to have a save/transfer-settings feature.

I am still stuck on one point with TweetDeck: I am unable to find and easily follow a new Twitter user. I am sure it is there somewhere, but it is just not that obvious (at least from my PoV). It will be interesting to watch it evolve.

Related Links:

Halo Wars Demo – Hits my Sweet Spot February 8, 2009

Posted by fvter in Gaming, Technology.

At the end of the week, the Halo Wars demo was released on the Xbox Live service. As soon as it hit the European servers, I jumped onboard.

I admit that I have always enjoyed a good RTS (real-time strategy game). They tingle my logic neurons, and I enjoy having to sit down, plan and be tactical in order to achieve a goal, unlike FPS-type games, which are more about fast-paced action & brute force.


Surprisingly, Halo Wars strikes a good balance between the need to be tactical and the fast pace of an action game. Building and creating your force de frappe has a good feel to it, as you can get things up and running quite quickly, but you need to plan ahead to be able to build more advanced units. Combat is fast paced and easy to jump into. The parts I played in the demo were very focused on achieving goals, and the intermingling of the Halo Wars storyline (including some amazing cut scenes) made it attractive and enticing to move forward in the game. I did not get a chance to play the online team mode, but I’ve heard that it is quite an experience as well.

All-in-all Halo Wars looks like it will hit a sweet spot :- a must buy!


200 days of Wii Fit! January 30, 2009

Posted by fvter in Gaming, Personal Status, Technology.

So, a few days ago, I hit 200 days of using Wii Fit. I try to keep to a pretty regular schedule, that is to say I at least do the Wii Fit body test every morning. Actual exercise is a bit different, since I am unable to do it on a very regular basis due to time constraints or just plain physical pain with my ankle, but I try to get in at least 3 to 4 days a week of the aerobics and muscle exercises, mixed in with a bit of yoga (for the stretching).

Overall, I find Wii Fit to be a good motivator and a useful tool for getting some regular exercise. My only complaint might be that the presentations of the exercises, although good, don’t necessarily let you keep a good and proper posture while executing them yourself. This is especially true in the yoga parts…

Despite that, I am quite content, and I have lost about 5kg on average since starting! Not bad, eh!
