Monday, June 29, 2009

Greenhouse Effect & Global Warming

Nowadays we often hear the term 'greenhouse effect'. Global warming seminars and conferences are being conducted everywhere, and each and every day we hear debates about these issues in news reports and other media. So, what actually is global warming (or the greenhouse effect)? And why is there growing concern about these issues?

The greenhouse effect and global warming are related to each other: the greenhouse effect is a phenomenon that leads to global warming. The trapping of infrared radiation by the earth's atmosphere is known as the greenhouse effect; the earth's atmosphere acts as a greenhouse. Strictly speaking, it is the atmospheric gases, mainly carbon dioxide, that trap the infrared radiation. This phenomenon keeps our earth warm and prevents it from cooling excessively at night. It works in the same way as a greenhouse.

The earth's atmosphere allows the short-wavelength (high-frequency) radiation coming from the sun to pass through it. These rays pass through the atmosphere and heat up the earth. As the earth warms, it too starts emitting infrared radiation, but since the earth is many times cooler than the sun, the radiation it emits has a much longer wavelength. The atmosphere does not let this longer-wavelength radiation escape and sends it back toward the surface, heating the earth again. This whole process is known as the greenhouse effect. Carbon dioxide is a greenhouse gas: a higher concentration of CO2 in the atmosphere traps infrared rays at a higher rate, which in turn raises the temperature of the earth. Hence, through the greenhouse effect, the temperature of the earth is steadily increasing, and that is global warming.
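The wavelength difference at the heart of this process follows from Wien's displacement law: a body's peak emission wavelength is inversely proportional to its temperature. Here is a small Python sketch of that arithmetic, using rough textbook temperatures for the sun and the earth:

# Wien's displacement law: peak emission wavelength is inversely
# proportional to the temperature of the radiating body.
WIEN_B = 2.898e-3  # Wien's displacement constant, metre-kelvin

def peak_wavelength_um(temperature_k):
    # Peak emission wavelength in micrometres for a body at T kelvin.
    return WIEN_B / temperature_k * 1e6

print(f"Sun   (~5778 K): {peak_wavelength_um(5778):.2f} um")  # ~0.50 um, visible light
print(f"Earth (~288 K):  {peak_wavelength_um(288):.2f} um")   # ~10 um, far infrared

The sun's output peaks around half a micrometre (visible light), while the earth re-radiates around ten micrometres, deep in the infrared, which is exactly the band the greenhouse gases absorb.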

Thursday, June 25, 2009

Steady State Theory

In this post I will try to explain the Steady State theory, the less popular of the two. Although it has many flaws and discrepancies, it is still the main competitor to the Big Bang.

As the name suggests, according to the Steady State theory the universe never really changes: even though it is expanding, its average density remains constant. In other words, the universe does not change its appearance over time; it has no beginning and no end. For this to hold, new matter must be continuously created to keep the overall density constant, and that matter is mostly created in the form of hydrogen.
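Just to see how little new matter this continuous creation would actually demand, here is a back-of-the-envelope Python sketch; it assumes a modern Hubble constant of about 70 km/s/Mpc and today's critical density, and uses the fact that holding the density rho fixed while space expands at rate H requires creation at roughly 3*H*rho per unit volume:

import math

H0 = 70e3 / 3.086e22   # Hubble constant, 70 km/s/Mpc converted to 1/s
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_H = 1.67e-27         # mass of a hydrogen atom, kg

rho = 3 * H0**2 / (8 * math.pi * G)   # critical density, kg/m^3
creation_rate = 3 * H0 * rho          # kg per m^3 per second to hold rho fixed

atoms_per_gyr = creation_rate * 3.156e16 / M_H   # seconds in a billion years
print(f"~{atoms_per_gyr:.1f} hydrogen atoms per cubic metre per billion years")

That works out to roughly one hydrogen atom per cubic metre every billion years, which is why the theory's supporters never regarded continuous creation as an extravagant assumption.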

Flaws:

1) The Steady State theory predicted that objects like quasars and radio galaxies should be found everywhere. But observations and calculations suggest that this is not the case; in fact these objects are found only at large distances.

2) Another flaw in the theory is related to the cosmic microwave background radiation. The Steady State theory holds that this radiation comes from ancient stars. However, cosmologists found this explanation unconvincing, as the cosmic microwave background is very smooth, making it difficult to explain how it arose from point sources.

Tuesday, June 23, 2009

Birth of the Universe - The Big Bang

Let us start with our first (and most probable) hypothesis: the Big Bang. This theory was originally proposed by Georges Lemaître. It rests mainly on three pillars, namely:
1) General relativity
2) Homogeneity
3) Isotropy

According to the Big Bang theory, the universe is expanding, or, equivalently, the galaxies are moving away from each other; the Doppler shift of their light demonstrates this. If the universe is expanding forward in time, then it must contract when we run time backwards, and it would keep contracting until it could contract no further, bringing all the mass of the universe into a single point: a "primeval atom" of extraordinarily high density and energy, at a point in time before which time and space did not exist.
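The modern quantitative form of this expansion is Hubble's law, v = H0 x d: the farther away a galaxy is, the faster it recedes. A minimal Python sketch, assuming the commonly quoted value H0 of about 70 km/s/Mpc:

H0 = 70.0  # Hubble constant, km/s per megaparsec (approximate modern value)

def recession_velocity(distance_mpc):
    # Hubble's law: recession velocity grows linearly with distance.
    return H0 * distance_mpc

for d in (10, 100, 1000):  # distances in megaparsecs
    v = recession_velocity(d)
    print(f"galaxy at {d:4d} Mpc recedes at ~{v:7.0f} km/s (z ~ {v / 3.0e5:.3f})")

Running the law backwards in time is precisely the extrapolation described above: shrink every distance to zero and all matter converges on the primeval atom.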

We will try to understand this concept using the example of a football. Consider a football and think of it as the primeval atom (the starting point of the whole universe). Now suppose we pump air into this football and keep pumping until it explodes. This point of explosion is the Big Bang. As the football explodes, the fragments fly away from each other; these fragments correspond to the galaxies, which keep moving away from one another.

This is the basis of the Big Bang theory. I will try to explain it in a more detailed manner (theoretically and mathematically) in my further posts.

Monday, June 22, 2009

Birth of the Universe

The universe consists of many galaxies and billions upon billions of stars and planets. Our own planet Earth is like a grain of sand on the floor of that vast ocean. Many of us often wonder how our planet, or our solar system, was born. This is indeed a very interesting and fascinating question. Just think how much energy is required to produce this huge amount of mass (planets and sun), or, vice versa, how much mass is required to produce such an enormous amount of energy (imagine the amount of energy produced by the sun every day). If you are not such a science geek, then let me remind you of a very famous relation proposed by Albert Einstein, the mass-energy relationship: E = mc^2.
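To get a feel for the scale involved, here is the arithmetic of E = mc^2 for a single kilogram of matter (the comparison uses the standard figure of 4.184 x 10^15 joules per megaton of TNT):

c = 2.998e8              # speed of light, m/s
MEGATON_TNT = 4.184e15   # joules released by one megaton of TNT

mass_kg = 1.0
energy_j = mass_kg * c**2   # Einstein's mass-energy relationship

print(f"E = {energy_j:.2e} J")                               # ~9.0e16 joules
print(f"  = ~{energy_j / MEGATON_TNT:.0f} megatons of TNT")  # ~21 megatons

One kilogram of matter, fully converted, is worth about 21 megatons of TNT; the sun converts over four million tonnes of mass into energy every second.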

If this is not enough, then let us think of something even bigger (or the biggest!): how did the whole universe come into existence?
We still do not have a definitive answer for this, but we do have a few hypotheses. Among these the most famous (or most probable) are:
1) Big Bang Theory
2) Steady State Theory

But before delving into these topics more deeply, let us try to identify the constituents of our universe (only a brief description!). Our universe mainly consists of:

a) Stars - these are the shining members of our universe. Stars emit their own light, which means they produce their own energy (basically heat and light). And how do they produce it? The answer is nuclear fusion.
E.g., our own Sun.

b) Planets - the best example is our own planet Earth. Planets are heavenly bodies that revolve around stars (due to gravity).

c) Natural Satellites - like our Moon. These natural satellites revolve around planets, but they do not produce their own light (the Moon appears to glow only because it reflects sunlight).

Apart from these, there are other heavenly bodies as well, like asteroids, comets, and meteors. We will discuss them later, because for now we are talking only about the birth of the universe.

So, let us go back to our hypotheses, namely the Big Bang and Steady State.
Of the two, the Big Bang is more popular and is accepted by most physicists and cosmologists, as it is supported by the most comprehensive and accurate scientific observations and evidence.
The Steady State theory, on the other hand, is a model developed in 1948 by Fred Hoyle, Thomas Gold, Hermann Bondi and others as an alternative to the Big Bang. It has far fewer supporters than the Big Bang theory.

As the names suggest, the Big Bang is about one big explosion, whereas Steady State refers to a static universe (one with no beginning and no end). We will discuss these two topics in more detail in my subsequent posts.

Thursday, June 18, 2009

Robots and Humanoids

There is no doubt about it: human evolution and performance are limited. Our intelligence and physical abilities are restricted, and we cannot survive without the vital nutrients, air and other elements necessary for our frail human nature. Therefore it is vitally important that we improve and protect our abilities, and what better way to do so than with mechanically engineered devices. The development of robots has helped us in our endeavors, be it for national defense purposes, on an industrial level where they are part of many manufacturing processes, or on a personal level at home where all kinds of gadgets help us in mastering our chores. But where is the world of robotics leading us? Will the world soon be overrun by androids, cyborgs and other automatons?

People already rely on artificial intelligence in many aspects of their daily routine, and even though most of these machines do not have humanlike features, they surround us: as cleaning tools that know when to return to their docking station once their cleaning cycle is done or their battery runs low, or as toys mimicking and behaving like favorite domesticated pets, to name just a few. If science fiction gets it right, a multi-tasking robotic butler, complete with arms, legs and speech, will soon become a common household feature. Many breakthroughs in the development of artificial interaction, locomotion, navigation, manipulation and intelligence have already been made, and various scientists are working equally hard on finalizing muscle-like, spring-based motors that will upgrade the limbs of their mechanical life forms and increase their humanoid efficiency and reaction time. Progress in skin development has enabled more lifelike robotic hands to sense objects and their movement. This means that if a bottle were slipping through the robot's hand, the pressure sensors embedded in the skin would record the pressure change, react to it, and stop the bottle from falling.
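That slip reflex is, at heart, a simple feedback loop: watch the skin pressure, and if it drops sharply, squeeze harder. Here is a toy Python sketch of that logic; the sensor readings and the grip interface are hypothetical stand-ins, not any real robot's API:

def grip_control_step(pressure_now, pressure_prev, grip_force,
                      slip_threshold=0.5, force_step=0.2):
    # One iteration of the slip reflex: a sharp drop in skin pressure
    # means the object is sliding, so tighten the grip.
    if pressure_prev - pressure_now > slip_threshold:
        grip_force += force_step
    return grip_force

# Simulated pressure readings as a bottle starts to slip through the hand.
readings = [5.0, 5.0, 4.9, 4.2, 3.4, 3.3]
force = 1.0
for prev, now in zip(readings, readings[1:]):
    force = grip_control_step(now, prev, force)
    print(f"pressure {now:.1f} -> grip force {force:.1f}")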

Robotic science has come a long way, and many laboratories have the structures to prove it. Domo is a robot like no other. Born and built at the Massachusetts Institute of Technology, Domo is the first robot able to give a hug. With less rigid elastic actuators in its fingers, wrists, arms and neck, and joints that are more flexible than those used in other humanoids, this robot senses not only the object, but also its pressure and flexibility. This allows Domo to fine-tune its grip, make handshakes less ferocious and hugs more temperate.

Monday, June 15, 2009

Huygens - Mathematical Genius and Astronomer

Christiaan Huygens was born in Holland in 1629. He was a mathematical genius who invented several important mechanical devices, including the pendulum clock, and he had a deep understanding of astronomy, optics, mathematics, and mechanics. He also built some of the largest telescopes of the 17th century. As a contemporary of Galileo, he heard about Galileo's telescope and set out to improve on the design. One of the liabilities of refracting telescopes like the one Galileo built was that optical problems distorted the view: the colors of objects were separated and images were not sharp, a phenomenon called chromatic aberration. Huygens discovered, after much experimentation, that this problem could be lessened by building lenses with much longer focal lengths, which were also easier to make accurately. Using this discovery he built telescopes as long as 120 feet. He called his style of telescope "tubeless" because it was an open-air frame without a tube; and while he did build a monster that was 120 feet long, it was cumbersome and difficult to use. Most of his observations and discoveries were made with smaller telescopes around forty feet in length.

He wasn't just a lens and telescope maker, though. He was also an avid astronomer, and he used his telescopes to make discoveries under the night sky. One of his most important was of the rings of Saturn. He wasn't the first to actually see them; Galileo did that. But because of chromatic aberration, Galileo's telescope wasn't good enough to resolve the rings into what they truly were. All Galileo saw was what he described as a tri-form planet, composed of a large center section with a smaller section attached to each side, much like the handles of a vase. Huygens, with his superior telescopes, could resolve the image of Saturn and see what was really there, and in 1656 he published his findings. He described Saturn as being composed of a central globe, much like Jupiter, girdled by a thin flat ring that did not touch it.

Saturday, June 13, 2009

Digital and Film Photography

Photography is a way of life for some and at least a part of life for everyone else. In this day and age, technological advances await us at every turn, and the field of photography is no exception. Cameras have gone digital, and the potential is astounding. The following will take a look at some of the differences between the old and new ways, and weigh them as either pros or cons of digital photography.

Scientifically speaking, the differences between the two are enormous. With film photography, light traveling through the camera's lens exposes the images onto the film. With digital photography, the light of the images is encoded as binary data and stored in memory, as with a computer. These differences, while huge, can be unimportant to some: few photographers are interested in the technical details of how their cameras work. The photographer cares more about what it means for the pictures he can take and what he can do with them.

One of the primary advantages of digital photography is versatility. Digital cameras can record not only the still images of film cameras, but also motion pictures and, in some cases, audio. While a film camera can be a specialized piece of equipment for taking still pictures, digital cameras can offer you an entire range of different equipment, all in the palm of your hand.

Digital cameras are also commonly found on other pieces of equipment. As technology advances, cellular telephones and MP3 music players now often have built-in cameras, which are always digital. This may offer some extra convenience to digital camera users, since it decreases the overall number of devices they must carry and use.

Printing your pictures is also very different between digital and film photography. In both cases, though, you have many options. Professional film photographers may develop their prints on their own, in their very own darkroom. Amateur or casual film photographers may simply drop their film off at a one-hour photo shop. With digital, your pictures are recorded as electronic data, so you can use your computer to print them. Or, if you prefer, you can still drop them off at a photo shop and have it done for you. So as far as printing goes, it is up to you how deeply you want to dive in: both film and digital offer a range of options, from the hands-on to letting others do it for you.

So in the end, choosing between digital and film may come down to the application. Hobbyists may stick with film, while technology buffs and burgeoning photographers choose the brave new world of digital. Either way, it looks like both styles of photography are going to be around for a while to come.

Friday, June 12, 2009

Thought Control - A Wonder Idea

Thought control? How would you like to be able to turn on your television just by thinking? Or have the door to your house open by mind power when your hands are full? This isn't something that will remain science fiction for long; the technology necessary to make it happen is here now.

First of all, you have basic thought control already, meaning you can control and direct your thoughts. You can imagine a friend talking in your mind, for example. Then you can choose to hear music in your imagination. If you are hooked up to an electroencephalograph while you do these things, it will also be clear that these two thoughts are handled in different parts of your brain.

This electroencephalogram, or EEG, is important, because what we can measure, we can use to do things. Think about this for a moment. Modern electronics has made it possible to easily operate things in response to measurement. A thermostat measures the temperature, for example, and turns the heater on or off according to that measurement. Security lights turn themselves on when light levels get low.

We can already measure and track what is going on in the brain. Is it inconceivable, then, to have that measurement automatically trigger some action? For example, even before the electrical patterns of the brain were made "visible," we measured pulse rate with many different machines. Now, what if, instead of sending a signal to a monitor telling a red light to go on when the heart races, the signal told the TV to turn on? Think of anything that gets your heart racing and the TV would turn on, right?

Call it mind power, thought control or whatever you like. You can see that such a device has been possible for at least a generation now. With new technology, and more detailed measurements of the actual electrical patterns of the brain, how much more is possible? Someday, an electroencephalograph-type device will be able to read your mind more directly. The technology will eventually get to the point where it can print out the actual words you are thinking. We are a long way from that, but we are right at the brink of building machines that give us thought control of the things around us.
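In software terms, the device imagined here is nothing more than a threshold trigger on a measured body signal. A minimal Python sketch, with made-up pulse-rate samples standing in for a real EEG or heart-rate feed:

def biofeedback_trigger(samples, threshold, window=5):
    # Fire an action whenever the recent average of a body signal
    # (pulse rate, EEG band power, ...) crosses a threshold.
    recent = []
    for sample in samples:
        recent.append(sample)
        if len(recent) > window:
            recent.pop(0)
        yield "TV ON" if sum(recent) / len(recent) > threshold else "tv off"

# Simulated pulse readings: calm at first, then the heart starts racing.
pulse = [62, 64, 63, 65, 80, 95, 110, 112]
for beat, state in zip(pulse, biofeedback_trigger(pulse, threshold=90)):
    print(beat, "->", state)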

Tuesday, May 12, 2009

Pyrolysis - Green Technology

Pyrolysis is an emerging technology, and its green credentials when the feed is biomass are top notch. Everyone who has lit a wood or coal fire and watched it burn has seen pyrolysis. Pyrolysis is usually the first chemical reaction that occurs in the burning of many solid organic fuels, like wood, cloth, and paper, and also of some kinds of plastic.

In a wood fire, the visible flames are not due to combustion of the wood itself, but rather of the gases released by its pyrolysis, whereas the flameless burning of embers is the combustion of the solid residue (charcoal) that pyrolysis leaves behind. Although the basic concepts of the process have been validated, the performance data for an emerging technology have not been evaluated according to methods approved by the EPA and adhering to EPA quality assurance/quality control standards.

Waste is converted to a fuel by heating it, and that fuel burns just as coal or wood does under the right controlled conditions. Whereas incineration fully converts the input waste into energy and ash, these processes limit the conversion so that combustion does not take place directly.

Waste plastic, under pressure and catalytic cracking, produces a usable fuel. Under certain temperature conditions the plastic's macromolecular chains are broken down into small molecular chains (simple hydrocarbon compounds); these smaller molecules contain C4 to C20 hydrocarbons, the components of petrol, coal oil, and diesel. Anhydrous pyrolysis can also be used to produce a liquid fuel similar to diesel from solid biomass.

Fast pyrolysis occurs in a few seconds or less, so not only chemical reaction kinetics but also heat and mass transfer processes, as well as phase transition phenomena, play important roles. Fast pyrolysis is a process in which organic materials are rapidly heated to 450-600 degrees C in the absence of air. Under these conditions, organic vapors, permanent gases and charcoal are produced.

Friday, May 1, 2009

Optical Microscopes

Optical microscopes use visible light and a system of lenses to magnify small samples that are usually invisible to the naked eye. The optical microscope is the first, oldest and simplest type of microscope (as opposed to the much more advanced electron microscope). The first optical microscopes were created around the end of the 16th century. Due to its compact size, simplicity and relatively low price, the optical microscope is very popular and can be found in use in many areas of biology. Optical microscopes typically magnify objects up to about 1,500 times. The earliest optical microscopes were structured in a way that is called "the simple microscope": a design that uses only a single lens to create a magnified image of the sample. Today, the simple structure survives only in the magnifying glass, the hand lens and the loupe.

The more advanced optical microscopes, and the ones that are popular today, are what are called "compound optical microscopes". These microscopes use a system of several lenses in order to "compound", or multiply, the magnification and thereby maximize it. The two main lens systems in an optical microscope are the objective lens (near the examined object) and the eyepiece lens (up near the eye of the scientist). Modern optical microscopes use multiple lenses both in the objective and in the eyepiece. Older optical microscopes also used a mirror to provide illumination from below the object; modern ones use a strong lamp to provide constant, strong illumination.
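The "compounding" itself is simple multiplication: the total magnification is the objective's power times the eyepiece's power. A quick Python sketch with typical lens values:

def total_magnification(objective, eyepiece):
    # A compound microscope multiplies its two stages of magnification.
    return objective * eyepiece

# A typical objective turret (4x, 10x, 40x, 100x) behind a 10x eyepiece.
for objective in (4, 10, 40, 100):
    total = total_magnification(objective, 10)
    print(f"{objective:3d}x objective x 10x eyepiece = {total:4d}x")

A 100x objective behind a 15x eyepiece gives the roughly 1,500x ceiling mentioned above.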

The main uses of compound optical microscopes include:

Examining small pieces of material, or even a smear or a squash preparation. This is because the optical microscope works by passing light up through the object and into the lenses, so the specimen should ideally be semi-transparent. In other uses, the optical microscope may be used to examine metal samples in order to study the metal's structure.
At low power, microscopes can be used to examine small living animals and plants. At high power, they can be used to examine bacteria. It is important to note that the vast advancement in the medical fields, and in biology in general, is owed to a large extent to the invention of the optical microscope. For example, the way blood flows through our bodies was not fully understood until the microscope made it possible to examine the behavior of small blood vessels.

Saturday, April 25, 2009

Information Technology Manufacturing Solutions

Rapidly changing market dynamics are pushing businesses across industry verticals, especially manufacturing, to come up with innovative ways to streamline the "3 Ds" of the product lifecycle, i.e., design, development and deployment, in order to deliver marketable and reliable products.

Emerging and ever-changing consumer tastes, evolving regulations, and ageing demographics call for a partner who can deliver technology solutions that generate tangible and measurable value for businesses, helping them achieve a strategic edge over the competition. Organizations that envisage co-building an integrated and sustainable world along with all their stakeholders, by leveraging technology, always stay ahead of the rest. Focused commitment and unique business models built around relationships and customer commitment are what win the race and sustain that lead to the end. One positive aspect of this race to success is that technology has become more intelligent, safer and more eco-friendly.

Irrespective of the industry vertical, companies that can strategically partner with customers to shorten the product development lifecycle and hasten time to market are the call of the ever-changing present and the future. In today's cut-throat drive to stay on top, customers play a major role in leveraging expertise to increase innovation and manage globalization, besides reducing costs across the manufacturing and production cycle. These factors bring about faster turnaround across the entire value chain, driven by proven domain expertise and high knowledge capital, and enable companies to build a compelling business advantage and deliver end-to-end services.

To stay ahead of the competition, manufacturers need to globalize their processes and systems efficiently through a unique blend of domain-intensive technology and process expertise that ensures delivering products faster to their target markets. In a world where success is tied to time and speed, there is a definite need for a partner who can secure the strategic edge that determines market leadership. That being the case, it is wise to partner with an established IT consulting organization whose sole focus is on co-creating technology products and solutions that help customers become efficient, integrated and innovative manufacturing enterprises, with uncompromising quality and a drive toward zero defects as the objective.

Sunday, April 19, 2009

Biotechnology And Agriculture

The methods of biotechnology involved include genetic engineering (GE), genomics and bioinformatics, marker-assisted selection, micropropagation, tissue culture, cloning, artificial insemination, embryo transfer, and other technologies. In producing improved agricultural crops by genetic engineering, researchers aim for strains with properties such as herbicide resistance, pest resistance, disease resistance, stress resistance and altered composition. All these properties will help farmers and large production companies grow crops that yield new products, increase the quality and quantity of existing products, and improve the composition of crops so that their production ensures healthy and tasty food. For example, some transgenic plants grow better under drought conditions and respond better when brought out of water stress. Such crops could also allow producers to make potato starch with a high amylopectin content for paper, textiles and adhesives, and rapeseed oil with greater erucic acid levels for producing plastics and industrial lubricants.

Biotechnology is proving to be a vital complement to conventional agricultural research, improving breeding and conservation programmes and giving insights into understanding and controlling plant diseases. Aside from making conventional research more precise, GE also gives scientists the dramatic ability to transfer genetic material between organisms that normally cannot be combined through natural means. Along with this newfound ability, however, come new issues and concerns that need to be addressed before any large-scale adoption can take place. These include the unintended transfer of transgenes, the development of resistance by weeds, pests and diseases, and potential allergies to exotic proteins. Transparent and impartial evaluation of developed strains to answer these questions, rather than reliance on media hype, will safeguard human health, protect the environment, and facilitate public acceptance of genetically engineered crops.

Saturday, April 18, 2009

Wireless Networks - Modulation Techniques

Many of the wireless technologies in the WPAN, WLAN, and WWAN categories transmit information using radio waves. For this to take place, the data is superimposed onto the radio wave, which is also known as the carrier wave, since it carries the data. This process is called modulation. There are many modulation techniques available, all with certain advantages and disadvantages in terms of efficiency and power requirements. The modulation techniques are as follows:

1. Narrowband technology - Narrowband radio systems transmit and receive data on a specific radio frequency. The frequency band is kept as narrow as possible to allow the information to be passed. Interference is avoided by coordinating different users on different frequencies. The radio receiver filters out all signals except those on the designated frequency. For a company to use narrowband technology, it requires a license issued by the government. Examples of such companies include many of the wide area network providers.

2. Spread spectrum - By design, spread spectrum trades off bandwidth efficiency for reliability, integrity, and security. It consumes more bandwidth than narrow-band technology, but produces a signal that is louder and easier to detect by receivers that know the parameters of the signal being broadcast. To everyone else, the spread-spectrum signal looks like background noise. Two variations of spread-spectrum radio exist: frequency-hopping and direct-sequence.

a). Frequency-hopping spread spectrum (FHSS) - FHSS uses a narrowband carrier that rapidly cycles through frequencies. Both the sender and receiver know the frequency pattern being used. The idea is that even if one frequency is blocked, another should be available. If this is not the case, then the data is re-sent. When properly synchronized, the result is a single logical channel over which the information is transmitted. To everyone else, it appears as short bursts of noise. The maximum data rate using FHSS is typically around 1 Mbps.
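The synchronization can be pictured as both ends deriving the same pseudo-random hop order from a shared seed. A minimal Python illustration; the 79-channel count echoes systems like Bluetooth, while the seed and hop length are arbitrary values chosen for the sketch:

import random

def hop_sequence(seed, channels=79, hops=10):
    # Derive a pseudo-random channel order from a shared seed. Sender and
    # receiver run this with the same seed and therefore hop in step.
    rng = random.Random(seed)
    return [rng.randrange(channels) for _ in range(hops)]

sender = hop_sequence(seed=42)
receiver = hop_sequence(seed=42)
assert sender == receiver   # same seed -> same hop pattern
print("hop pattern:", sender)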

b). Direct-sequence spread spectrum (DSSS) - DSSS spreads the signal across a broad band of radio frequencies simultaneously. Each bit transmitted has a redundant bit pattern called a chip. The longer the chip, the more likely the original data can be recovered. Longer bits also require more bandwidth. To receivers not expecting the signal, DSSS appears as low-power broadband noise and is rejected. DSSS requires more power than FHSS, but data rates can be increased to a maximum of 2 Mbps.
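As a concrete example of chipping, the original 802.11 DSSS physical layer used an 11-chip Barker code, sending the code for one bit value and its complement for the other. A small Python sketch of that spreading step (the bit-to-chip polarity convention here is arbitrary):

# The 11-chip Barker sequence used by the original 802.11 DSSS layer.
BARKER_11 = [1, 0, 1, 1, 0, 1, 1, 1, 0, 0, 0]

def spread(bits):
    # Replace every data bit with the Barker code (XOR spreading):
    # a 0 bit sends the code as-is, a 1 bit sends its complement.
    chips = []
    for bit in bits:
        chips.extend(chip ^ bit for chip in BARKER_11)
    return chips

print(spread([1, 0]))   # 22 chips on air for 2 data bits: 11x spreading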

3. Orthogonal Frequency Division Multiplexing (OFDM) - OFDM transmits data in a parallel method, as opposed to the hopping technique used by FHSS and the spreading technique used by DSSS. This protects it from interference since the signal is being sent over parallel frequencies. OFDM has ultrahigh spectrum efficiency, meaning that more data can travel over a smaller amount of bandwidth.
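Under the hood, "parallel frequencies" means mapping symbols onto orthogonal subcarriers with an inverse FFT and prepending a cyclic prefix. A bare-bones numpy sketch; the subcarrier count and random QPSK symbols are arbitrary choices for illustration:

import numpy as np

N = 64    # number of orthogonal subcarriers
CP = 16   # cyclic-prefix length, a guard against multipath echoes

# One QPSK symbol per subcarrier: data rides on all N frequencies at once.
rng = np.random.default_rng(0)
symbols = (2 * rng.integers(0, 2, N) - 1) + 1j * (2 * rng.integers(0, 2, N) - 1)

time_signal = np.fft.ifft(symbols)                     # combine the subcarriers
tx = np.concatenate([time_signal[-CP:], time_signal])  # prepend cyclic prefix

# The receiver strips the prefix; an FFT recovers every subcarrier's symbol.
rx = np.fft.fft(tx[CP:])
assert np.allclose(rx, symbols)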

Fundamentals of Wireless Networks

Wireless networks can be divided into two broad segments: short-range and long-range. Short-range wireless pertains to networks that are confined to a limited area. This applies to local area networks (LANs), such as corporate buildings, school campuses, manufacturing plants or homes, as well as to personal area networks (PANs) where portable computers within close proximity to one another need to communicate.

These networks typically operate over unlicensed spectrum reserved for industrial, scientific, medical (ISM) usage. The available frequencies differ from country to country. The most common frequency band is at 2.4 GHz, which is available across most of the globe. Other bands at 5 GHz and 40 GHz are also often used. The availability of these frequencies allows users to operate wireless networks without obtaining a license, and without charge.

Long-range networks continue where LANs end. Connectivity is typically provided by companies that sell wireless connectivity as a service. These networks span large areas such as a metropolitan area, a state or province, or an entire country. The goal of long-range networks is to provide wireless coverage globally. The most common long-range network is the wireless wide area network (WWAN). When true global coverage is required, satellite networks are also available.

Sunday, April 12, 2009

Alternative Energy - Continued

Geothermal energy is extremely abundant, since it lies directly beneath our feet, just a few miles below the earth's surface. This energy is produced by the heating of water through the action of the earth's fantastically hot molten core. The water turns to steam, which can be harnessed to drive turbines that in turn generate electricity. Great amounts of research and development should be put into tapping geothermal energy.

Waste gas energies, which are essentially methane, reverse the usual energy-pollution relationship by creating energy from waste lying in dumps and from some air pollutants. This gas is used in fuel cells and can be used in standard gasoline generators. Ethanol is a gasoline substitute created from such things as wheat, sugarcane, grapes, strawberries, corn, and even wood chips and wood cellulose. There is controversy over whether this fuel will ever become truly economical or practical outside very localized areas, but technologies for its extraction and blending are continuously being refined.

Biodiesel energy is created from the oils contained in plants. So far, commercial stores of biodiesel have been created using soybean, rapeseed, and sunflower oils. Biodiesel is typically produced by entrepreneurially minded individuals or those who want to experiment with alternative energy, but commercial interest from companies is on the rise. It burns much cleaner than oil-based diesel.

Atomic energy is created in atomic energy plants using the process of nuclear fission. This energy is extremely efficient and can generate huge amounts of power. There is concern from some people about what to do with the relatively small amount of waste product atomic energy gives off, since it is radioactive and takes hundreds of years to decay into harmlessness.

Alternative Energy Sources

There is a lot of energy that we can harness if we only seek to research and develop the technologies needed to do so. We can get away from the fossil fuels and the old electrical grids by turning to alternatives to these energy sources.

One of these alternative energy resources is wind power. Wind turbines continue to be developed that are progressively more energy efficient and less costly. "Wind farms" have been springing up in many nations, and they have even become more strategically placed over time so that they are not jeopardizing birds as former wind turbines did.

Another alternative energy resource is the most well known one: solar energy. This involves manufacturing solar cells that gather the energy given off directly by the sun and translate it into electricity or, in some cases, hot water. As with wind energy, solar energy creates absolutely zero pollution. Ocean wave energy is seen by governments and investors as having enormous energy-generating potential. A generator in France has been in operation for many years now and is considered a great success, and the Irish and Scots are running experimental facilities.

Hydroelectric power has been with us for a while, and where it is set up it is a powerful generator of electricity that is cleaner than the grid. However, there are limits on the availability of suitable places to set up a large dam. Many run-of-the-river, or small and localized, hydroelectric generators have been set up in recent times due to this limitation.

Friday, April 10, 2009

LASIK - The Carving of the Cornea

LASIK, or "Laser-Assisted In Situ Keratomileusis", is the only commonly performed keratomileusis procedure. Keratomileusis involves altering the shape of the corneal tissue with the aid of an excimer laser, a powerful ultraviolet laser. This laser eye surgery is performed by ophthalmologists in order to correct different types of vision impairment. LASIK is the preferred treatment for patients with refractive error, since the procedure entails rapid recovery and minimal pain overall. LASIK surgery leverages technology to its fullest.

For instance, a computer system tracks the patient’s eye position 4,000 times per second, while the lasers make the desired incisions. It might sound like a dangerous procedure, but it’s time to debunk this myth. LASIK surgery is a completely safe procedure performed with high precision. That is, the laser carves the corneal tissue in a finely controlled manner. LASIK surgery is not that cumbersome when compared with other types of eye surgery, and has a relatively low frequency of complications. Though LASIK surgery is performed with the patient awake and functional, the eye surgeon typically administers a mild sedative and anesthetic eye drops. No matter what the type of vision impairment, altering the shape of the cornea is a viable solution.

In general, the procedure has very few side effects and offers almost instant results. However, a few complications may arise depending on the extent of the patient's refractive error and other irregularities in the corneal tissue. LASIK eye surgery, with excellent technology at its disposal, is improving at a rapid rate. However, there is no conclusive evidence as to the chances of long-term complications from the surgery.

Although relatively uncommon, a few complications may arise, namely corneal infection, a slipped flap, haziness, halos or glare. An important point to note is that this laser-assisted procedure is irreversible. LASIK has gained popularity due to its efficacy and improved precision, and it boasts some of the lowest complication statistics relative to other eye surgeries. With only a 5% complication rate, LASIK sounds like a safe enough procedure to rectify your vision impairment.

Satellite Radio

Satellite radio has been a godsend to people who live in or travel regularly through remote locations, or indeed to anyone required to travel long distances. Static-free reception can now be experienced and enjoyed by listeners with a satellite radio, even in the remotest of locations.

In the past, if you were travelling for long periods, every hour or so you would have to start fiddling with the radio dial as the station you were just listening to faded in and out and finally turned to static. You would then frantically try to locate a new station, and just as you did, it too would turn to static. This would go on until eventually there were no decent stations left on the dial, and finally you would succumb to putting on a cassette or a CD, or even turning off the stereo altogether. But with the advent of satellite radio, static, tuning, fiddling and complete boredom will soon be a thing of the past.

Standard, conventional radio signals are only able to travel around 30-40 miles from their transmitters, so if you travel beyond this distance the signal gets weaker and weaker until you can no longer hear the transmission at all. Satellite radio waves, by contrast, travel from space (around 22,000 miles up), meaning that you can travel across the entire country without ever having to change national radio stations, because the signal remains consistent and strong.