Monday, June 29, 2009
Greenhouse Effect & Global Warming
Greenhouse effect and global warming are related to each other: the greenhouse effect is the phenomenon that leads to global warming. The trapping of outgoing infrared radiation by Earth's atmosphere is known as the greenhouse effect. Earth's atmosphere acts like a greenhouse. It is the atmospheric gases, mainly carbon dioxide, that absorb the infrared radiation. This phenomenon keeps our Earth warm and prevents it from cooling excessively at night. It works in much the same way as a greenhouse.
Earth's atmosphere allows most of the short-wavelength (high-frequency) radiation coming from the Sun, chiefly visible light, to pass through it. This radiation passes through the atmosphere and heats up the Earth's surface. As the Earth warms, it too begins to emit radiation. Since the Earth is far cooler than the Sun, the radiation it emits has much longer wavelengths, in the infrared. The atmosphere does not let this long-wavelength radiation escape freely; greenhouse gases absorb it and re-radiate part of it back towards the surface, heating it further. This whole process is known as the greenhouse effect. Carbon dioxide is a greenhouse gas. A higher concentration of CO2 in the atmosphere traps outgoing infrared radiation more effectively, which in turn raises the Earth's temperature. This leads to global warming: because of an enhanced greenhouse effect, the average temperature of the Earth is steadily increasing.
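The wavelength difference between sunlight and Earth's radiation follows from Wien's displacement law, peak wavelength = b/T: the hotter the body, the shorter its peak emission wavelength. A minimal sketch (the temperatures are standard textbook values):

```python
# Wien's displacement law: peak emission wavelength falls as temperature rises.
WIEN_B = 2.898e-3  # Wien's displacement constant, metre-kelvins

def peak_wavelength_m(temperature_k):
    """Wavelength (metres) at which a blackbody at this temperature emits most strongly."""
    return WIEN_B / temperature_k

sun_peak = peak_wavelength_m(5778)   # Sun's surface, ~5778 K -> ~0.5 micrometres (visible light)
earth_peak = peak_wavelength_m(288)  # Earth's surface, ~288 K -> ~10 micrometres (infrared)

print(f"Sun peaks at   {sun_peak * 1e6:.2f} um")
print(f"Earth peaks at {earth_peak * 1e6:.1f} um")
```

So the Sun's energy arrives mostly as visible light, while the Earth radiates it back as infrared, which is exactly the radiation the greenhouse gases absorb.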
Thursday, June 25, 2009
Steady State Theory
As the name suggests, the Steady State theory holds that the universe, on the largest scales, never changes. Even though the universe is expanding, the theory states that its average density remains constant: the universe does not change its appearance over time, and so has no beginning and no end. For this to hold, new matter must be continuously created so as to keep the overall density constant. This matter is created mostly in the form of hydrogen.
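The required creation rate can be estimated: to keep the density constant while space expands at the Hubble rate H, matter must appear at a rate of 3Hρ per unit volume. A rough sketch using today's approximate values (the numbers are illustrative, not Hoyle's originals):

```python
import math

H0 = 2.27e-18          # Hubble constant, ~70 km/s/Mpc, expressed in 1/s
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_HYDROGEN = 1.67e-27  # mass of a hydrogen atom, kg
GYR_S = 3.156e16       # seconds in a billion years

# Critical density of the universe: rho = 3 H^2 / (8 pi G)
rho = 3 * H0**2 / (8 * math.pi * G)

# Steady State: volume grows at fractional rate 3H, so creation rate = 3 * H * rho
creation_rate = 3 * H0 * rho  # kg per cubic metre per second

atoms_per_m3_per_gyr = creation_rate / M_HYDROGEN * GYR_S
print(f"~{atoms_per_m3_per_gyr:.1f} hydrogen atoms per cubic metre per billion years")
```

That works out to roughly one hydrogen atom per cubic metre per billion years, which is why the continuous creation the theory requires could never be observed directly.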
Flaws:
1) Steady State predicted that objects like quasars and radio galaxies should be found everywhere. But observations and calculations suggest that this is not the case: these objects are found only at large distances, i.e. in the distant past.
2) Another flaw in the theory concerns the cosmic microwave background radiation. Steady State attributes this radiation to light from ancient stars. However, cosmologists found this explanation unconvincing: the cosmic microwave background is very smooth, making it difficult to explain how it could arise from point sources.
Tuesday, June 23, 2009
Birth of Universe-The Big-Bang
The Big Bang model rests on three key assumptions:
1) General relativity
2) Homogeneity
3) Isotropy
According to the Big Bang theory, the universe is expanding, or in other words the galaxies are moving away from each other; the Doppler shift of their light provides the evidence. If the universe is expanding now, then running time backwards it must contract, and it would continue to contract until it could contract no further, bringing all the mass of the universe into a single point, a "primeval atom" of enormous density and energy, before which time and space did not exist.
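The Doppler-shift evidence can be summarised by Hubble's law, v = H0 × d: the farther away a galaxy is, the faster it recedes. A small sketch (the galaxy distance is a made-up example):

```python
H0 = 70.0       # Hubble constant, km/s per megaparsec (approximate modern value)
C = 299792.458  # speed of light, km/s

def recession_velocity(distance_mpc):
    """Hubble's law: recession velocity (km/s) for a galaxy at this distance."""
    return H0 * distance_mpc

d = 100.0  # hypothetical galaxy 100 megaparsecs away
v = recession_velocity(d)
z = v / C  # low-velocity approximation of the observed redshift

print(f"Galaxy at {d:.0f} Mpc recedes at {v:.0f} km/s (redshift z ~ {z:.3f})")
```

Running the expansion backwards from such measurements is exactly the extrapolation that leads to the primeval atom.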
We can understand this concept using the analogy of a football. Consider a football and think of it as the primeval atom (the starting point of the whole universe). Now suppose we pump air into this football and keep pumping until it explodes. This point of explosion corresponds to the Big Bang. As the football explodes, the fragmented pieces fly away from each other. These fragments correspond to the galaxies, which keep moving away from one another.
This is the basis of the Big Bang theory. I will try to explain it in more detail (theoretically and mathematically) in further posts.
Monday, June 22, 2009
Birth of Universe
If this is not enough, then let us think of something even bigger (or the biggest!): how did the whole universe come into existence?
Still, we do not have a definitive answer for this. But we do have a few hypotheses. Among these, the most famous (and most probable) are:
1) Big Bang Theory
2) Steady State Theory
But before delving into these topics more deeply, let us try to identify the constituents of our universe (only a brief description!). Our universe mainly consists of:
a) Stars- these are the shining members of our universe. Stars emit their own light; that is, they produce their own energy (mainly heat and light). And how do they produce it? The answer is nuclear fusion.
E.g.- our own Sun.
b) Planets- The best example is our own planet earth. Planets are heavenly bodies that revolve around stars (due to gravity).
c) Natural Satellites- like our Moon. These natural satellites revolve around planets but do not produce their own energy (the Moon appears to glow only because it reflects sunlight).
Apart from these, there are other heavenly bodies such as asteroids, comets, and meteors. We will discuss them later; for now we are talking only about the birth of the universe.
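The fusion energy that powers stars (point a above) comes from a mass defect: four hydrogen nuclei are slightly heavier than the helium nucleus they fuse into, and the difference is released as energy via E = mc². A quick estimate with standard atomic masses:

```python
# Masses in unified atomic mass units (u); 1 u of mass is equivalent to 931.494 MeV
M_HYDROGEN = 1.007825  # hydrogen-1 atom
M_HELIUM = 4.002602    # helium-4 atom
U_TO_MEV = 931.494

mass_defect = 4 * M_HYDROGEN - M_HELIUM  # mass lost when four H fuse into one He
energy_mev = mass_defect * U_TO_MEV      # E = mc^2, expressed in MeV

print(f"Mass defect: {mass_defect:.5f} u -> {energy_mev:.1f} MeV per helium nucleus")
```

About 26.7 MeV per fusion, roughly 0.7% of the original mass turned into heat and light: this is what keeps our Sun shining.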
So, let us go back to our hypothesis, namely Big Bang and Steady state.
Of these two, the Big Bang is more popular and is accepted by most physicists and cosmologists, since it is supported by the most comprehensive and accurate scientific observations and evidence.
On the other hand, the Steady State theory is a model developed in 1948 by Fred Hoyle, Thomas Gold, Hermann Bondi and others as an alternative to the Big Bang theory. But this theory has far fewer supporters than the Big Bang theory.
As the names suggest, the Big Bang is about one big explosion, whereas Steady State refers to a static universe (no beginning and no end). We will discuss these two topics in more detail in subsequent posts.
Thursday, June 18, 2009
Robots and Humanoids
People already rely on artificial intelligence in many aspects of their daily routine. Even if most of these machines do not have humanlike features, they surround us: as cleaning robots that know when to return to their docking station after a cleaning cycle is done or when their battery runs low, or as toys mimicking and behaving like favourite domesticated pets, to name just a few. If science fiction gets it right, a multi-tasking robotic butler, complete with arms, legs and speech, will soon become a common household feature. Many breakthroughs in artificial interaction, locomotion, navigation, manipulation and intelligence have already been made, and various scientists are working equally hard on finalizing muscle-like, spring-based motors that will upgrade the limbs of their mechanical life forms and increase their humanoid efficiency and reaction time. Progress in skin development has enabled more lifelike robotic hands to sense objects and their movement. This means that if a bottle were slipping through a robot's hand, the pressure sensors embedded in the skin would register the change, react to it, and stop the bottle from falling. Robotic science has come a long way, and many laboratories have the structures to prove it.
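The slipping-bottle behaviour described above amounts to a simple feedback loop: watch the skin's pressure readings and tighten the grip whenever pressure drops sharply. A toy sketch of that logic (the sensor trace, thresholds and force units are invented for illustration):

```python
def adjust_grip(pressure_readings, grip_force, slip_threshold=0.2, force_step=1.0):
    """Tighten grip whenever pressure between consecutive readings drops sharply (object slipping)."""
    for previous, current in zip(pressure_readings, pressure_readings[1:]):
        if previous - current > slip_threshold:  # sudden pressure drop -> object is slipping
            grip_force += force_step
    return grip_force

# Simulated skin-sensor trace: steady hold, then the bottle slips twice.
trace = [1.0, 1.0, 0.7, 0.7, 0.4, 0.4]
print(adjust_grip(trace, grip_force=5.0))  # -> 7.0 (grip tightened twice)
```

Real robot hands close this loop hundreds of times per second, but the principle is the same: detect the pressure change, react, keep the bottle from falling.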
Domo is a robot like no other. Born and built at the Massachusetts Institute of Technology, Domo is the first robot able to give a hug. With less rigid elastic actuators in his fingers, wrists, arms and neck, and joints that are more flexible than those used in other humanoids, this robot feels not only the object, but also its pressure and flexibility. This allows Domo to fine-tune its grip, make handshakes less ferocious and hugs more temperate.
Monday, June 15, 2009
Huygens-Mathematical Genius and Astronomer
He wasn’t just a lens and telescope maker though. He was also an avid astronomer and he used his telescopes to make some discoveries under the night sky. One of his most important discoveries was of the rings of Saturn. He wasn’t the first one to actually see them. Galileo did that. But because of chromatic aberration Galileo’s telescope wasn’t good enough to resolve the rings into what they truly were. All Galileo saw was what he described as a tri-form planet which was composed of a large center section that had a section attached to each side. He described these side sections as being much like the handles of a vase. Huygens, with his better telescopes, could resolve the image of Saturn better and see what was really there and in 1656 he published his findings. He described Saturn as being composed of a central globe much like Jupiter but girdled by a thin flat ring that did not touch it.
Saturday, June 13, 2009
Digital and Film Photography
Digital cameras are also commonly found on other pieces of equipment. As technology advances, cellular telephones and MP3 music players now often have built-in cameras, which are always digital. This offers some extra convenience to digital camera users, since they can reduce the overall number of devices they must carry with them and use.

Printing your pictures is also very different between digital and film photography. In both cases, though, you have many options. Professional film photographers may develop their prints on their own, in their very own darkroom. Amateur or casual film photographers may simply drop their film off at a one-hour photo place. With digital, your pictures are recorded as electronic data, so you can use your computer to print them. Or, if you prefer, you can still drop them off at a photo shop and have it done for you. So as far as printing goes, it is up to you how deeply you want to dive in: both film and digital offer a range of options, from hands-on to letting others do it for you.

So in the end, choosing between digital and film may come down to the application. Hobbyists may stick with film, while technology buffs and burgeoning photographers will choose the brave new world of digital. Either way, it looks like both styles of photography are going to be around for a while to come.
Friday, June 12, 2009
Thought Control-A Wonder Idea
We can already measure and track what is going on in the brain. Is it inconceivable, then, to have that measurement automatically trigger some action? For example, even before the electrical patterns of the brain were made "visible," we measured pulse rate with many different machines. Now, what if instead of sending a signal to a monitor telling a red light to go on when the heart raced, the signal told the TV to turn on? Think of anything that gets your heart racing, and the TV would turn on.

Call it mind power, thought control or whatever you like: such a device has been possible for at least a generation now. With new technology, and more detailed measurements of the actual electrical patterns of the brain, how much more is possible? Someday, an electroencephalograph-type device will be able to read your mind more directly. The technology may eventually get to the point where it can print out the actual words you are thinking. We are a long way from that, but we are right at the brink of building machines that give us thought control of the things around us.
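The heart-races-so-the-TV-turns-on idea above is, at bottom, just a threshold detector on a measured signal. A minimal sketch (the sample values, baseline and threshold are invented for illustration):

```python
def should_trigger(samples, baseline, threshold=20, window=5):
    """Trigger when the average of the last `window` samples exceeds baseline + threshold."""
    if len(samples) < window:
        return False
    recent = samples[-window:]
    return sum(recent) / window > baseline + threshold

resting = [72, 70, 71, 73, 72]     # resting pulse, beats per minute
excited = [95, 100, 104, 99, 102]  # heart racing

print(should_trigger(resting, baseline=72))  # False: TV stays off
print(should_trigger(excited, baseline=72))  # True: TV turns on
```

Averaging over a window rather than reacting to single samples is what keeps a momentary blip from switching the TV on by accident; brain-computer interfaces apply the same idea to far richer signals.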
Tuesday, May 12, 2009
Pyrolysis-Green technology
Waste plastic, under pressure and catalytic cracking, can be converted into fuel. Under suitable temperature conditions the plastic's macromolecular chains are broken down into small molecular chains (simple hydrocarbon compounds) containing C4 to C20 carbon atoms; these compounds are the components of petrol, kerosene, and diesel.

Anhydrous pyrolysis can also be used to produce a liquid fuel similar to diesel from solid biomass. Fast pyrolysis occurs in a few seconds or less, so not only chemical reaction kinetics but also heat and mass transfer processes, as well as phase transition phenomena, play important roles. Fast pyrolysis is a process in which organic materials are rapidly heated to 450-600 degrees C in the absence of air. Under these conditions, organic vapours, permanent gases and charcoal are produced.
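The role temperature plays in the reaction kinetics mentioned above can be illustrated with the Arrhenius equation, k = A·exp(-Ea/RT): a modest rise in temperature speeds decomposition up enormously. A sketch with illustrative parameters (the pre-factor and activation energy below are typical orders of magnitude, not measured values for any particular plastic):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def arrhenius_rate(temperature_k, a=1e13, ea=200e3):
    """Arrhenius rate constant; pre-factor `a` and activation energy `ea` (J/mol) are illustrative."""
    return a * math.exp(-ea / (R * temperature_k))

k_low = arrhenius_rate(450 + 273.15)   # 450 degrees C
k_high = arrhenius_rate(600 + 273.15)  # 600 degrees C

print(f"Decomposition is ~{k_high / k_low:.0f}x faster at 600 C than at 450 C")
```

This steep temperature dependence is why fast pyrolysis, with its rapid heating, finishes in seconds while slower processes take much longer.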
Friday, May 1, 2009
Optical Microscopes
The more advanced optical microscopes, and the ones that are popular today, are what are called "compound optical microscopes". These microscopes use a system of several lenses in order to "compound", or multiply, the magnification, and therefore maximize it. The two main lens systems in an optical microscope are the objective lens (near the examined object) and the eyepiece lens (up near the eye of the scientist). Modern optical microscopes use multiple lenses in both the objective and the eyepiece. Older optical microscopes used a mirror to provide illumination below the object; modern ones use a strong lamp to provide constant, strong illumination.
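The "compounding" of magnification described above is simply multiplication: total magnification = objective power × eyepiece power. A tiny illustration:

```python
def total_magnification(objective_power, eyepiece_power):
    """A compound microscope multiplies the magnifications of its two lens systems."""
    return objective_power * eyepiece_power

# Typical combinations: a 10x eyepiece paired with 4x, 10x, 40x or 100x objectives
for objective in (4, 10, 40, 100):
    total = total_magnification(objective, 10)
    print(f"{objective:>3}x objective with 10x eyepiece -> {total}x total")
```

So the familiar "400x" figure on a student microscope is just a 40x objective seen through a 10x eyepiece.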
The main uses of compound optical microscopes include:
Examining small pieces of material, or even a smear or a squash preparation. This is because the optical microscope passes light up through the object and into the lenses, which is why the specimen should ideally be semi-transparent. In other applications the optical microscope may be used to examine metal samples, in order to study the metal's structure.
At low power, microscopes can be used to examine small living animals and plants. At high power, they can be used to examine bacteria. It is important to note that the vast advancement in medicine and biology in general is owed, to a large extent, to the invention of the optical microscope. For example, the way blood flows through our bodies was not fully understood until the microscope made it possible to examine the behaviour of small blood vessels.
Saturday, April 25, 2009
Information Technology Manufacturing Solutions
Emerging and ever-changing consumer tastes, evolving regulations, and ageing demographics call for a partner who can deliver technology solutions that generate tangible, measurable value for businesses and a strategic edge over the competition. Organizations that set out to build an integrated and sustainable business together with all their stakeholders, by leveraging technology, stay ahead of the rest. Focused commitment and business models built around relationships and customer commitment are what win the race and sustain it to the end. One positive aspect of this race to success is that technology has become more intelligent, safer and more eco-friendly.
Irrespective of the industry vertical, companies that can strategically partner with customers to shorten the product development lifecycle and speed products to market are what the ever-changing present and future demand. In today's cut-throat drive to stay on top, customers play a major role in leveraging expertise to increase innovation and manage globalization, besides reducing costs across the manufacturing and production cycle. These factors bring about faster turnaround across the entire value chain, driven by proven domain expertise and high knowledge capital, and enable companies to build a compelling business advantage and deliver end-to-end services.
To stay ahead of the competition, manufacturers need to globalize their processes and systems efficiently, through a blend of domain-intensive technology and process expertise that delivers products faster to their target markets. In a world where success depends on time and speed, there is a definite need for a partner who can provide that strategic edge and help determine market leadership. That being the case, it is wise to partner with an established IT consulting organization whose sole focus is on co-creating technology products and solutions that help customers become efficient, integrated and innovative manufacturing enterprises, with uncompromising quality and a drive toward zero defects as the objective.
Sunday, April 19, 2009
Biotechnology And Agriculture
Biotechnology is proving to be a vital complement to conventional agricultural research, improving breeding and conservation programmes and giving insights into understanding and controlling plant diseases. Aside from making conventional research more precise, genetic engineering also gives scientists the dramatic ability to transfer genetic material between organisms that normally cannot be combined through natural methods. Along with this newfound ability, however, come new issues and concerns that need to be addressed before any large-scale adoption can take place. These include unintended transfer of transgenes, development of resistance by weeds, pests and diseases, and potential allergies from exotic proteins. Transparent and impartial evaluation of developed strains to answer these questions, rather than reliance on media hype, will safeguard human health, protect the environment, and facilitate public acceptance of genetically engineered crops.
Saturday, April 18, 2009
Wireless Networks- Modulation techniques
1. Narrowband technology - Narrowband radio systems transmit and receive data on a specific radio frequency. The frequency band is kept as narrow as possible to allow the information to be passed. Interference is avoided by coordinating different users on different frequencies. The radio receiver filters out all signals except those on the designated frequency. For a company to use narrowband technology, it requires a license issued by the government. Examples of such companies include many of the wide area network providers.
2. Spread spectrum - By design, spread spectrum trades off bandwidth efficiency for reliability, integrity, and security. It consumes more bandwidth than narrow-band technology, but produces a signal that is louder and easier to detect by receivers that know the parameters of the signal being broadcast. To everyone else, the spread-spectrum signal looks like background noise. Two variations of spread-spectrum radio exist: frequency-hopping and direct-sequence.
a). Frequency-hopping spread spectrum (FHSS) - FHSS uses a narrowband carrier that rapidly cycles through frequencies. Both the sender and receiver know the frequency pattern being used. The idea is that even if one frequency is blocked, another should be available. If this is not the case, then the data is re-sent. When properly synchronized, the result is a single logical channel over which the information is transmitted. To everyone else, it appears as short bursts of noise. The maximum data rate using FHSS is typically around 1 Mbps.
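The key to FHSS as described above is that sender and receiver derive the same pseudo-random hop sequence, so both always know which frequency comes next. A toy sketch using a shared seed (the channel numbering and seed are illustrative, not any standard's actual hop set):

```python
import random

def hop_sequence(shared_seed, channels, hops):
    """Pseudo-random frequency-hop pattern; both ends seed an identical generator."""
    rng = random.Random(shared_seed)
    return [rng.choice(channels) for _ in range(hops)]

# The 2.4 GHz band split into illustrative 1 MHz channels (values in MHz)
channels = list(range(2402, 2480))

sender = hop_sequence(shared_seed=42, channels=channels, hops=8)
receiver = hop_sequence(shared_seed=42, channels=channels, hops=8)

print(sender)
print("Synchronized:", sender == receiver)  # same seed -> same hop pattern
```

To a receiver without the seed, each hop is just a short, unpredictable burst on some frequency, which is why FHSS traffic looks like noise.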
b). Direct-sequence spread spectrum (DSSS) - DSSS spreads the signal across a broad band of radio frequencies simultaneously. Each bit transmitted has a redundant bit pattern called a chip. The longer the chip, the more likely the original data can be recovered. Longer bits also require more bandwidth. To receivers not expecting the signal, DSSS appears as low-power broadband noise and is rejected. DSSS requires more power than FHSS, but data rates can be increased to a maximum of 2 Mbps.
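The chipping described above can be sketched with the 11-chip Barker sequence used in 802.11 DSSS: each data bit is multiplied by the chip sequence, and the receiver recovers it by correlating against the same sequence (correlation near ±11 for a real signal, near zero for noise). A minimal round-trip sketch:

```python
# 11-chip Barker sequence (as +1/-1 chips), used in 802.11 DSSS
BARKER_11 = [+1, -1, +1, +1, -1, +1, +1, +1, -1, -1, -1]

def spread(bits):
    """Each data bit (+1 or -1) becomes 11 chips: bit * Barker sequence."""
    return [bit * chip for bit in bits for chip in BARKER_11]

def despread(chips):
    """Correlate each 11-chip group with the Barker sequence to recover one bit."""
    bits = []
    for i in range(0, len(chips), 11):
        correlation = sum(c * b for c, b in zip(chips[i:i + 11], BARKER_11))
        bits.append(+1 if correlation > 0 else -1)
    return bits

data = [+1, -1, -1, +1]
signal = spread(data)            # 44 chips on the air instead of 4 bits
print(despread(signal) == data)  # the receiver recovers the original bits
```

The redundancy is visible here: eleven chips carry one bit, which is the bandwidth-for-reliability trade-off the section describes.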
3. Orthogonal Frequency Division Multiplexing (OFDM) - OFDM transmits data in a parallel method, as opposed to the hopping technique used by FHSS and the spreading technique used by DSSS. This protects it from interference since the signal is being sent over parallel frequencies. OFDM has ultrahigh spectrum efficiency, meaning that more data can travel over a smaller amount of bandwidth.
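OFDM's "parallel method" can be sketched with a discrete Fourier transform: the inverse transform packs one symbol onto each orthogonal subcarrier to build the time-domain signal, and the forward transform at the receiver separates them again. A pure-Python toy (real systems use many more subcarriers, plus cyclic prefixes and error coding):

```python
import cmath

def idft(symbols):
    """Inverse DFT: map one complex symbol per subcarrier into a time-domain signal."""
    n = len(symbols)
    return [sum(symbols[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

def dft(signal):
    """Forward DFT: recover the per-subcarrier symbols from the time-domain signal."""
    n = len(signal)
    return [sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

# Eight QPSK-style symbols, one per parallel subcarrier
symbols = [1+1j, 1-1j, -1+1j, -1-1j, 1+1j, -1-1j, 1-1j, -1+1j]

tx_signal = idft(symbols)   # transmitted OFDM time-domain samples
recovered = dft(tx_signal)  # receiver separates the parallel subcarriers

print(all(abs(r - s) < 1e-9 for r, s in zip(recovered, symbols)))
```

Because the subcarriers are orthogonal, they can be packed tightly without interfering with each other, which is the source of OFDM's high spectral efficiency.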
Fundamentals of Wireless Networks
These networks typically operate over unlicensed spectrum reserved for industrial, scientific, medical (ISM) usage. The available frequencies differ from country to country. The most common frequency band is at 2.4 GHz, which is available across most of the globe. Other bands at 5 GHz and 40 GHz are also often used. The availability of these frequencies allows users to operate wireless networks without obtaining a license, and without charge.
Long-range networks continue where LANs end. Connectivity is typically provided by companies that sell wireless connectivity as a service. These networks span large areas such as a metropolitan area, a state or province, or an entire country. The goal of long-range networks is to provide wireless coverage globally. The most common long-range network is the wireless wide area network (WWAN). When true global coverage is required, satellite networks are also available.
Sunday, April 12, 2009
Alternative Energy- Continued
Geothermal energy is extremely abundant, since it lies directly beneath our feet, just a few miles below the earth's surface. This energy is produced by the heating of water through the actions of earth's fantastically hot molten core. The water turns to steam, which can be harnessed and used to drive turbine engines which in turn generate electricity. Great amounts of research and development should be put into geothermal energy tapping.
Waste-gas energy, which is essentially methane, reverses the usual energy-pollution relationship by creating energy from waste lying in dumps and from some air pollutants. This gas is used in fuel cells and can be used in standard gasoline generators. Ethanol is a gasoline substitute created from such things as wheat, sugarcane, grapes, strawberries, corn, and even wood chips and wood cellulose. There is controversy over whether this fuel will ever become truly economical or practical outside very localized areas, but technologies for its extraction and blending are continuously being refined.
Biodiesel energy is created out of the oils contained in plants. So far, the commercial stores of biodiesel have been created using soybean, rapeseed, and sunflower oils. Biodiesel is typically produced by entrepreneurial minded individuals or those who want to experiment with alternative energy, but commercial interest from companies is on the rise. It burns much cleaner than oil-based diesel.
Atomic energy is created in atomic energy plants using the process of nuclear fission. This energy is extremely efficient and can generate huge amounts of power. There is concern from some people about what to do with the relatively small amount of waste product atomic energy gives off, since it is radioactive and takes many centuries to decay to harmless levels.
Alternative Energy Sources
One of these alternative energy resources is wind power. Wind turbines continue to be developed that are progressively more energy efficient and less costly. "Wind farms" have been springing up in many nations, and they have even become more strategically placed over time so that they are not jeopardizing birds as former wind turbines did.
Another alternative energy resource is the one that is most well known: solar energy. This involves the manufacturing of solar cells which gather and focus the energy given off directly by the sun, and translate it into electricity or, in some cases, hot water. As with wind energy, solar energy creates absolutely zero pollution. Ocean wave energy is seen by governments and investors as having enormous energy generating potential. A generator in France has been in operation for many years now and is considered to be a great success, and the Irish and Scots are running experimental facilities.
Hydroelectric power has been with us for a while, and where it is set up it is a powerful generator of electricity, cleaner than fossil-fuel generation. However, suitable sites for large dams are limited. Because of this limitation, many run-of-the-river, or small and localized, hydroelectric generators have been set up in recent times.
Friday, April 10, 2009
LASIK-The Carving Of The Cornea
LASIK, or “Laser-Assisted In Situ Keratomileusis”, is the only commonly performed keratomileusis procedure. Keratomileusis involves altering the shape of the corneal tissue with the aid of an excimer laser, a powerful ultraviolet laser. This laser eye surgery is performed by ophthalmologists in order to correct different types of vision impairment. LASIK is the preferred treatment for patients with refractive error, since the procedure entails rapid recovery and minimal pain overall. LASIK surgery leverages technology to its fullest.
For instance, a computer system tracks the patient’s eye position 4,000 times per second, while the lasers make the desired incisions. It might sound like a dangerous procedure, but it’s time to debunk this myth. LASIK surgery is a completely safe procedure performed with high precision. That is, the laser carves the corneal tissue in a finely controlled manner. LASIK surgery is not that cumbersome when compared with other types of eye surgery, and has a relatively low frequency of complications. Though LASIK surgery is performed with the patient awake and functional, the eye surgeon typically administers a mild sedative and anesthetic eye drops. No matter what the type of vision impairment, altering the shape of the cornea is a viable solution.
In general, the procedure has very few side effects and offers instant results. However, a few complications may arise depending on the extent of the patient’s refractive error and other irregularities in the corneal tissue. LASIK eye surgery, with excellent technology at its disposal, is improving at a rapid rate. However, there is no conclusive evidence regarding the chances of long-term complications from the surgery.
Although relatively uncommon, a few complications may arise, namely corneal infection, slipped flap, haziness, halo or glare. An important point to note is that this laser-assisted procedure is irreversible. LASIK has gained popularity due to its efficacy and improved precision. The procedure also boasts one of the lowest complication rates among eye surgeries. With a complication rate of only around 5%, LASIK sounds like a safe enough procedure to correct your vision impairment.
Satellite Radio
In the past, if you were travelling for long periods, every hour or so you would have to start tweaking the radio dial as the station you were just listening to faded in and out and then dissolved into static. You would then frantically try to locate a new station to listen to, and just as you found one, it too would turn to static. This would go on until eventually there were no decent stations left on the dial, and finally you would give up and put on a cassette or a CD, or turn off the entire stereo altogether. But with the advent of satellite radio, static, tuning, fiddling and complete boredom will soon be things of the past.
Standard, conventional radio signals are only able to travel around 30-40 miles from their original transmitters, so if you travel beyond this distance the signal gets weaker and weaker until you are no longer able to hear the transmission at all. Satellite radio signals, however, are beamed down from space (around 22,000 miles up), meaning that you can travel across the entire country without ever changing national radio stations, because the signal remains consistent and strong.
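The 30-40 mile figure for terrestrial radio matches the radio horizon: FM signals travel essentially line-of-sight, limited by the curvature of the Earth, roughly d ≈ √(2Rh) for a transmitter of height h. A quick sketch (the tower height is a made-up example):

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean radius of the Earth, metres

def radio_horizon_miles(antenna_height_m):
    """Approximate line-of-sight distance to the horizon for an antenna at this height, in miles."""
    distance_m = math.sqrt(2 * EARTH_RADIUS_M * antenna_height_m)
    return distance_m / 1609.34

# A tall FM broadcast tower, ~300 m
print(f"~{radio_horizon_miles(300):.0f} miles")
```

That comes out to roughly 38 miles, right in the quoted range, whereas a satellite 22,000 miles up can see nearly an entire hemisphere at once.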