The perfect storm - time to review client computing strategies

Friday, February 27, 2009
  • Organisations are faced with making decisions that will impact their approach to IT for the next decade
    Ad hoc decisions can no longer be justified: IT decisions made now will have to be strategic, creating a flexible platform for IT and the business over the long term.
  • The business has to be in control - IT is just an enabler
    For too long, IT has been a constraint on how businesses are able to work. Now, IT has to be seen as the facilitator that allows the business to excel.
  • The PC can no longer be regarded as the main client device
    There is an explosion in device types that is driving the need for a more inclusive approach to application, function and data access.
  • The argument is not between the use of a "thick client" PC and a thin client
    With the advent of new form factors, such as the iPhone and other intelligent mobile devices, the real argument is about providing a consistent platform for accessing functionality.
  • But some users are better suited to a thin client
    Task workers (e.g. those in contact centres) and tethered users (those who spend most of their working time at a specific desk, generally seen as office workers) are best served by a thin client using a server-hosted image. Occasionally disconnected or highly mobile users will require a thick client - but server-based controls can still be applied.
  • Data security has become a major issue
    Placing information on central servers allows far greater control to be applied to it. If data is not stored on the end-user device itself, the value of the device is reduced to its replacement cost alone.
  • Business continuity is a major driver - server-based computing makes this attainable for the individual
    Disaster recovery as a practice is being overtaken by the need to keep users, departments and organisations working continuously. By using data centre techniques to maintain the availability of server-based images, failure of an end-user device is not a problem - the user can simply log in from a different device to access their existing image.
  • Centralised server-based images will improve green credentials
    The average thick client draws around 40W continuously, yet typically uses only around 4% of its available resources. A thin client can run at less than 10W. Server-based images can make optimum use of a virtual server farm, providing strong overall savings in power usage (a rough worked example follows this list).
  • Asset lifecycles can be extended, creating further savings
    As the main workloads are moved from the client to the server, the client no longer has to be upgraded to keep up with changes in operating system and application requirements. Therefore, the average lifetime for a client can be extended to be more in line with its mechanical life. At the server side, virtualisation allows workloads to be better served from existing assets, so extending the life of servers.
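
To put the power figures above into perspective, the sketch below works through a rough per-user annual comparison. Only the 40W thick client and sub-10W thin client figures come from the point above; the per-user share of server power, the powered-on hours and the electricity tariff are illustrative assumptions, not measured values.

```python
# Rough per-user annual power comparison: thick client vs. thin client plus
# a share of the server farm. The 40W and 10W figures come from the briefing;
# everything else below is an illustrative assumption.

THICK_CLIENT_W = 40      # average continuous draw of a desktop PC (from the text)
THIN_CLIENT_W = 10       # upper bound quoted for a thin client (from the text)
SERVER_SHARE_W = 8       # assumed per-user share of the virtual server farm
HOURS_PER_YEAR = 2500    # assumed powered-on hours, roughly a working year
PRICE_PER_KWH = 0.12     # assumed electricity tariff (GBP per kWh)

def annual_kwh(watts: float, hours: float = HOURS_PER_YEAR) -> float:
    """Convert a continuous power draw in watts into annual kWh."""
    return watts * hours / 1000.0

thick_kwh = annual_kwh(THICK_CLIENT_W)
thin_kwh = annual_kwh(THIN_CLIENT_W + SERVER_SHARE_W)
saving_kwh = thick_kwh - thin_kwh

print(f"Thick client:              {thick_kwh:.0f} kWh/year")
print(f"Thin client + server share: {thin_kwh:.0f} kWh/year")
print(f"Saving per user:            {saving_kwh:.0f} kWh/year "
      f"(~GBP {saving_kwh * PRICE_PER_KWH:.2f})")
```

Even under these conservative assumptions the per-seat saving is real, and it compounds quickly when multiplied across an estate of several thousand desktops.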

Conclusions
Server-based computing is nothing new, but past issues have held it back from general usage outside of a few well-defined cases. However, with the growing maturity of the basic tools, combined with the strong emergence of platform technologies such as virtualisation and streaming, now is the perfect time to review the position of the desktop and other devices within the organisation.