Green technologies
Some of the most widely used green technologies are described below:
Data center
An adequate design of the data center is very important, since it houses all the infrastructure supporting the various computing services, and a good structure allows significant savings in energy, space and cost in the medium and long term. Each company must choose the design appropriate to its own needs; this is not a strict procedure, but rather a set of good practices in data center design.
To reduce energy consumption, you can start with the simplest action: turning off equipment that is not in use. Some cluster management systems, such as Moab or SLURM, are incorporating energy-saving mechanisms that allow idle nodes to be turned off and turned back on when the system load requires it. In addition, there are other systems, such as CLUES, that allow energy-saving policies to be applied independently of the cluster management system.
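The idea behind these mechanisms can be sketched as a simple policy loop. This is an illustrative sketch only, not the real CLUES or SLURM API: the `Node` class, the timeout value and the power-off call are all assumptions.

```python
from dataclasses import dataclass

IDLE_TIMEOUT = 600  # seconds a node may stay idle before power-off (assumed value)

@dataclass
class Node:
    name: str
    running_jobs: int = 0
    powered_on: bool = True
    idle_since: float = 0.0  # timestamp when the node last became idle

def apply_power_policy(nodes, now):
    """Power off nodes that have been idle longer than IDLE_TIMEOUT.
    A real system would issue an IPMI or wake-on-LAN command here and
    wake the node again when the scheduler queue grows."""
    powered_off = []
    for node in nodes:
        if node.powered_on and node.running_jobs == 0:
            if now - node.idle_since >= IDLE_TIMEOUT:
                node.powered_on = False
                powered_off.append(node.name)
    return powered_off
```

For example, a node idle for longer than the timeout is switched off while a busy node is left untouched; the cluster manager then wakes nodes on demand.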
Another issue to consider is hardware reduction. This consists of studying the percentage of each computer's capacity that is actually used; according to IDC, only about 15% is typically used. Once the results are obtained for each computer in the company, the lightly used ones can be consolidated onto a single machine, unless the particularities of each service do not allow it.[12]
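Consolidation can be viewed as a packing problem: given the measured utilization of each server, pack them onto as few physical hosts as possible while leaving headroom. The following first-fit-decreasing sketch is illustrative only; real consolidation must also respect RAM, I/O and the service-isolation constraints mentioned above.

```python
def consolidate(utilizations, capacity=0.8):
    """Pack per-server utilization fractions onto as few hosts as possible.
    'capacity' (an assumed 80% here) leaves headroom on each host."""
    hosts = []  # each entry is the total utilization placed on one host
    for u in sorted(utilizations, reverse=True):
        for i, load in enumerate(hosts):
            if load + u <= capacity:
                hosts[i] += u  # fits on an existing host
                break
        else:
            hosts.append(u)    # open a new host
    return hosts
```

With the IDC figure of roughly 15% utilization, ten such servers would fit on just two hosts under this policy.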
Another important aspect is the possibility of relocating the data center somewhere that reduces energy use or makes better use of renewable energy, as Google has done by placing its data centers near hydroelectric plants to take full advantage of this energy source and reduce costs. Microsoft's data center in San Antonio uses sensors that measure all energy consumption, runs internally developed energy management software called Scry, employs large-scale virtualization, and recycles the water used for cooling.[13]
Likewise, software designed with an appropriate architecture can help improve the performance of the applications hosted in the data center. According to IBM, each watt of power consumed by an application on a server entails approximately 27 watts of associated support power in the data center, for information backup, storage and other services. The more efficient the application, the smaller its impact on the hardware, even without virtualization.
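The IBM ratio quoted above implies that every watt an application saves is multiplied across the data center, which a one-line calculation makes concrete:

```python
SUPPORT_WATTS_PER_APP_WATT = 27  # ratio cited from IBM in the text

def total_power(app_watts):
    """Total draw attributable to an application: its own consumption
    plus the associated data-center support (backup, storage, cooling)."""
    return app_watts * (1 + SUPPORT_WATTS_PER_APP_WATT)
```

Under this ratio, an application drawing 10 W accounts for 280 W of total data-center power, so halving its consumption saves 140 W, not 5 W.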
Another important consideration is technology for saving space and energy in storage. A study carried out by NetApp (a storage technology vendor) with researchers from the University of California, Santa Cruz found that 95% of the files stored at two large companies were opened only once in four months. This confirms that a large proportion of stored files are rarely used and, together with the idea that storage can remain offline, it motivates techniques that use less energy. One example is MAID (Massive Array of Idle Disks) technology, whose disks turn off when they are not active.[12]
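The MAID principle can be expressed as a spin-down policy: only disks accessed recently keep spinning, the rest are powered down. This is a hypothetical sketch of the policy, not a real MAID controller interface.

```python
def spinning_disks(last_access, now, idle_window):
    """MAID-style policy sketch: a disk keeps spinning only if it was
    accessed within the last 'idle_window' time units; all other disks
    are spun down to save energy.

    last_access -- dict mapping disk id to timestamp of last access
    """
    return {d for d, t in last_access.items() if now - t <= idle_window}
```

Given the NetApp finding that most files go untouched for months, such a policy would leave the vast majority of disks powered down most of the time.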
According to a study by Sun Microsystems,[14] data center trends are:
- Data center consolidation will lower operational costs.
- The operational energy cost of servers will exceed their purchase cost within the next 5 years.
- More consumers will adopt the use of thin clients.
- More applications will run outside the data center, such as Software as a Service and social networking.
- Automation of computer centers will continue to advance.
- Memory and I/O bottlenecks will be the next capacity issue to solve.
- Data centers will move toward modular designs.
Virtualization
Virtualization is a technology that shares computing resources among different environments, allowing different systems to run on the same physical machine. It pools servers, storage and applications into a single physical resource. Server virtualization allows multiple virtual servers to operate on a single physical server: if a server uses only a fraction of its capacity, the spare hardware can host additional virtual machines. Virtualization helps reduce the carbon footprint of the data center by reducing the number of physical servers and consolidating multiple applications on a single server, which consumes less energy and requires less cooling. In addition, it achieves a higher resource utilization rate and saves space.[15]
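A rough estimate shows why consolidation through virtualization saves energy. All figures in this sketch (target utilization, watts per server) are illustrative assumptions, not vendor data:

```python
import math

def virtualized_footprint(n_servers, avg_util, target_util=0.65,
                          watts_per_server=300):
    """Estimate the effect of consolidating n lightly used servers as
    VMs on fewer hosts driven to 'target_util' average utilization."""
    hosts = max(1, math.ceil(n_servers * avg_util / target_util))
    return {
        "hosts": hosts,
        "watts_before": n_servers * watts_per_server,
        "watts_after": hosts * watts_per_server,
    }
```

For instance, twenty servers at the 15% utilization cited earlier would fit on five hosts under these assumptions, cutting the server power draw from 6,000 W to 1,500 W.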
The trend toward virtualization in the United States began with the power generation crisis of 2006. Research showed that energy consumption would increase by 15% to 18% each year, while supply would grow by only 6% to 9% annually. With virtualization, companies managed to reduce their energy consumption, cutting costs and at the same time their impact on the environment.[16]
Gartner estimates that global revenue from virtualization will increase by 43%, from $1.9 billion in 2008 to $2.7 billion in 2009. Global virtualization penetration will reach 20% in 2009, up from 12% in 2008.[17] In Latin America, the implementation of virtualization is estimated to have increased by 30% during 2009.[18]
The adoption of virtualization is driven by the need to reduce costs, increase the speed of application deployment and reduce the impact on the environment by reducing the carbon footprint of organizations.
Client/Server
The client/server environment, sometimes referred to as a thin-client model, keeps software, applications and data on the server. Information can be accessed from any location, and the client does not require much memory or storage. This environment consumes less energy and requires less cooling.
To earn EPA Energy Star certification, computers in idle or sleep mode must consume no more than 50 watts. Equipment that consumes less energy is now in demand, and high-efficiency computers such as the Fit PC and the Zonbu PC have already been developed, with enough capacity to run an operating system but so compact that they consume only 5 watts. Companies such as Sun Microsystems have also developed thin clients; the Sun Ray uses 4 to 8 watts because the processing is performed on the server.
An interesting fact is that in one day, these devices consume less energy than a traditional computer consumes in an hour.[19]
Thin clients combined with virtualization will significantly reduce power consumption. According to Gartner, if the user interfaces of all personal computer applications were virtualized to a thin client/server model, IT overhead costs would fall by 50%.[20] Likewise, according to Dr. Hartmut Pflaum, a Fraunhofer researcher, while desktop computers consume around 85 watts on average, thin clients, including their share of the server, use 40 to 50 watts. If the energy used by ten million corporate personal computers were reduced this way, 485,000 tons of carbon emissions could be avoided per year, along with savings of 78 million in electricity costs.[21]
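The Fraunhofer figures above can be checked with a back-of-the-envelope calculation. The 85 W and 40–50 W figures come from the text; the hours of use (2,000 h/year) and the grid emission factor (0.6 kg CO2/kWh) are assumptions of this sketch:

```python
def fleet_savings(n_machines, desktop_w=85, thin_client_w=45,
                  hours_per_year=2000, kg_co2_per_kwh=0.6):
    """Energy and CO2 saved by replacing desktops with thin clients.
    Returns (kWh saved per year, tonnes of CO2 avoided per year)."""
    delta_kw = (desktop_w - thin_client_w) / 1000.0
    kwh_saved = n_machines * delta_kw * hours_per_year
    return kwh_saved, kwh_saved * kg_co2_per_kwh / 1000.0

kwh, tonnes = fleet_savings(10_000_000)
# roughly 800 million kWh and about 480,000 t CO2 per year, which is
# in line with the 485,000-ton figure cited above
```

The agreement with the cited figure depends on the assumed usage hours and emission factor, but the order of magnitude holds.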
Computer networks
Network (grid) computing is the application of a set of computers to a common problem at the same time, usually a technical or scientific problem that requires a large number of processing cycles or access to large amounts of data. It is a distributed arrangement of nodes, composed of a cluster of loosely coupled, networked computers acting together to solve very large tasks, usually computationally intensive scientific, mathematical or academic problems.
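The core idea, splitting one long task into chunks that independent nodes compute in parallel, can be illustrated in miniature. Here local worker threads stand in for grid nodes; a real grid would dispatch each chunk to a separate machine over the network:

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds):
    """Work unit: sum of squares over a half-open range [lo, hi)."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def grid_sum(n, workers=4):
    """Split the sum of the first n squares into 'workers' chunks and
    combine the partial results, mimicking a grid's divide-and-gather."""
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers - 1)]
    chunks.append(((workers - 1) * step, n))  # last chunk takes the remainder
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))
```

The result is identical to the sequential computation; the benefit of a real grid lies in running the chunks on many machines at once.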
Computer networks make it possible for multiple institutions to collaboratively combine their resources to solve computing-intensive problems; in recent years they have moved toward the adoption of Service-Oriented Architecture (SOA). This is confirmed by Goble and De Roure (2007),[22] who argue that the ubiquity of SOA is a driver of research into more agile solutions in science and industry.
Computer networks are thus shifting from a single supermachine residing in the data center of a specific institution to a collection of geographically separated computers.
Cloud computing
Cloud computing is a form of distributed computing that provides its users with the ability to use a wide range of resources on computer networks to complete their work.[23] Resources are dynamically scaled and provided as a service over the Internet. Users need no knowledge of, experience with, or control over the technological infrastructure.
By using cloud computing, companies become greener because they reduce their energy consumption, increasing capacity without having to invest in more infrastructure. In addition, the hardware utilization rate increases because resources are shared.
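The dynamic scaling described above is typically driven by a control loop that grows or shrinks capacity with demand instead of provisioning for the peak. The thresholds and limits in this sketch are illustrative assumptions:

```python
def autoscale(current_instances, avg_utilization,
              low=0.30, high=0.70, min_instances=1, max_instances=100):
    """Return the new instance count: add a machine when average
    utilization is high, release one when it is low, else hold steady."""
    if avg_utilization > high:
        return min(current_instances + 1, max_instances)
    if avg_utilization < low:
        return max(current_instances - 1, min_instances)
    return current_instances
```

Because idle capacity is released back to the shared pool, overall hardware utilization rises and total energy use falls relative to each company running its own peak-sized infrastructure.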
Telecommuting
Defined by Merriam-Webster as working at home through an electronic link with a central office, teleworking makes it possible for an organization's employees to stay at home and do their work without being present in the office. By not traveling to the main office, employees use less gasoline, which results in less pollution by taking at least one car off the road per day.
Companies can reduce their carbon footprint in different ways. The first is to implement an ecological initiative within their data centers or their energy consumption. Another way is to take advantage of technologies such as teleworking (telecommuting), which reduces the gasoline consumed by employees and therefore pollution. A report issued by the American Electronics Association found that 1.35 billion gallons of gasoline could be saved if every U.S. worker with the skills to telework did so 1.6 days per week instead of commuting. In addition to helping companies reduce their carbon footprint, teleworking can also be used as a recruitment and retention tool. A recent study of more than 1,400 managers showed that one third consider teleworking the main incentive for attracting the best employees, and nearly half of the rest consider it the second-best incentive after economic benefits.[24]
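The per-worker fuel saving behind such reports is simple to estimate. The 1.6 days/week figure matches the report cited above; the commute length, fuel economy and working weeks in this sketch are illustrative assumptions:

```python
def fuel_saved_per_year(round_trip_miles, mpg,
                        telework_days_per_week=1.6, weeks_per_year=48):
    """Gallons of gasoline one worker saves per year by teleworking:
    trips avoided times fuel burned per round trip."""
    trips_avoided = telework_days_per_week * weeks_per_year
    return trips_avoided * round_trip_miles / mpg
```

Under these assumptions, a worker with a 30-mile round trip in a 24-mpg car saves about 96 gallons per year; multiplied across millions of eligible workers, figures on the order of the report's 1.35 billion gallons become plausible.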