Wednesday 25 February 2015

Know the Different Types of Mining Processes

Mining has become a controversial industry because of its "devastating" effects on the environment and the ecosystem. However, it has contributed so much to civilization that without it, we could never be where we are today in many respects.

There are two basic methods of mining: surface mining and underground mining.

1. Surface Mining

This involves the mining of minerals located at or near the surface of the earth. It encompasses at least six processes:

• Strip Mining - this involves stripping away the earth's surface with heavy machinery. This method is generally used to extract coal or sedimentary rock that lies near the earth's surface.

• Placer Mining - this involves the extraction of minerals from sand or gravel sediments. It is a simple, old-fashioned way of mining, generally applied to gold and precious gems that are carried along by flowing water.

• Mountain Top Mining - this is a newer method that involves blasting off a mountaintop to expose the coal deposits that lie beneath the crest.

• Hydraulic Mining - this is a largely obsolete method that involves jetting the side of a mountain or hill with high-pressure water to expose gold and other precious metals.

• Dredging - this involves the removal of rocks, sand and silt from beneath a body of water to expose the minerals below.

• Open Pit - this is the most common mining method. It involves removing the top layers of soil and rock to reach the ore beneath. The miners dig deeper and deeper until a large open pit is created.

2. Underground Mining

This is the process in which a tunnel is driven into the earth to reach the mineral ore. The operation is usually performed with specialized underground mining equipment. Underground mining is done through the following methods:

• Slope Mining - this involves driving sloping tunnels into the ground in order to reach the ore or mineral deposit. This process is generally applied in coal mining.

• Hard Rock Mining - this method uses dynamite or giant drills to create large, deep tunnels. The miners support the tunnels with pillars to prevent them from collapsing. This is a large-scale mining process, usually applied to the extraction of large copper, tin, lead, gold or silver deposits.

• Drift Mining - this method is applicable only when the target mineral is accessible from the side of a mountain. It involves driving a tunnel slightly below the target mineral, so that gravity makes the deposit fall into the tunnel, where miners can collect it.

• Shaft Method - this involves the creation of a vertical passageway that descends deep underground to where the deposit is located. Because of the depth, miners are brought in and out of the mine with elevators.

• Borehole Method - this involves the use of a large drill and high-pressure water to flush out the target mineral.

These are the basic methods used to extract common minerals. More complex systems exist, but they are all based on these fundamental processes.

Source: http://ezinearticles.com/?Know-the-Different-Types-of-Mining-Processes&id=7932442

Saturday 21 February 2015

Ancient Basic Tools to Green Light Laser: The Evolution of Mining

Mining is the process of extracting minerals and other geological materials from the earth, and miners help recover many elements this way. These materials are valuable because they cannot be grown, farmed or artificially created. Precious metals, coal and diamonds are just some of them. Mining also helps man unearth non-renewable energy sources such as natural gas and petroleum, and even water. The job of miners can be difficult and risky, but thanks to efficient mining equipment, the task is a lot easier now.

People of ancient times made use of the earth for many purposes, and mining was one way to make a living. Equipment was not fully developed, but people managed to unearth many precious stones and different kinds of metals. They used these minerals and elements to make basic tools for hunting and warfare. High-quality flint found in masses of sedimentary rock was in demand in many parts of Europe; people used these flints as weapons during the Stone Age.

The ancient Egyptians were among the first to successfully extract minerals from the earth. Their advanced level of civilization made it possible for them to produce quality mining tools. They mined malachite and gold; malachite is a green stone used for pottery and ornaments. When the Egyptians began to quarry for minerals not found in their own soil, they headed to Nubia, a region of Africa, where they used iron tools as mining equipment. That was also the era of fire-setting, a method used to extract gold from ore: a fire is set against the rock containing the mineral to heat it, and the rock is then doused with water so that it cracks. This was the most effective mining method of its time.

The Romans also played an important part in the history of mining. They were the first to use large-scale quarrying methods, for example applying large volumes of water to operate simple machinery and wash away debris. This was the birth of hydraulic mining.

The demand for metal increased dramatically in the 1300s, when swords, armor and other weapons were in demand. For this reason, miners looked for more sources of iron and silver. An increase in the demand for coins also caused a shortage of silver, while iron was used in building construction. With the high value of these materials, machinery and other mining equipment came into demand in the market.

These machines were the forerunners of the mining tools we have today. Miners now use bulldozers, explosives and trucks, and more advanced tools include green light lasers that serve as saw guides and machine-alignment aids. With all this modern equipment, miners have a safer and faster way to break down rocks and even carve out mountains. All these tools are produced and applied using the principles of engineering.

As of today, mining falls into a few major categories: coal, metal ore, non-metallic mineral mining, and oil and gas extraction. Oil and gas extraction is among the biggest industries in the world today.

Source: http://ezinearticles.com/?Ancient-Basic-Tools-to-Green-Light-Laser:-The-Evolution-of-Mining&id=6768619

Tuesday 17 February 2015

Revitalize and Refresh Your Home With a Dry Organic Deep Extraction Carpet Cleaning

While everyone is familiar with old-style, water-intensive steam carpet cleaning methods, few are aware of the benefits of high-powered dry extraction carpet cleaning technology. With today's environmental concerns and water shortages throughout the country, dry extraction carpet cleaning is starting to gain popularity. This method employs vigorous agitation, deep-cleaning organic and biodegradable cleansing materials, and high-powered vacuum extraction to rejuvenate and cleanse deep into the carpet fibers.

The agitation system is composed of two counter-rotating nylon brushes that are safe for any synthetic or natural carpet fiber. A natural carrier material holds the cleaning agents and is spread much like a carpet powder. High vacuum pressure with HEPA filtration then extracts deep-down dirt, grime and mold particles. Because dry extraction carpet cleaning uses no water, the carpet is ready to walk on as soon as the cleaning is finished.

With old-style steam carpet cleaning, it often takes several hundred gallons of water to achieve the same results. And while that type of carpet cleaning may seem less expensive, what many of these companies don't tell you is that the water they use will come from your own tap. Many of the cheapest steam cleaning companies simply use machines that pump the used water back into your yard.

Dry extraction carpet cleaning involves no additional or hidden costs for the customer. The equipment is lightweight and easy to maneuver, allowing the technician to finish the job in about half the time of conventional steam methods. The completely biodegradable organic carrier agent is worked into the carpet to do the cleaning, then extracted with a high-powered vacuum. The twin-brush agitation method stretches and extends the carpet pile, leaving a texture similar to that of freshly laid carpet.

The scent is pleasant and not overwhelming, leaving the home smelling fresh. Dry extraction carpet cleaning has been around commercially for quite a few years, but only now is it starting to gain recognition and pose serious competition to other carpet cleaning services. When looking around for your next carpet cleaning service, consider a dry deep extraction system. With this method there are no harsh chemicals or cleaning agents that can harm your carpets, whether they are wool, shag, cut pile or premium import.

Try dry extraction cleaning the next time you want your carpet deep-down clean; you won't be disappointed.

For the absolute best home cleaning and maid service in Metro North Atlanta, MaidPro can get the job done. You dirty it, we can clean it - guaranteed! We only use safe, organic, hypoallergenic cleaning supplies and systems.

Source: http://ezinearticles.com/?Revitalize-and-Refresh-Your-Home-With-a-Dry-Organic-Deep-Extraction-Carpet-Cleaning&id=1608594

Thursday 12 February 2015

Why Common Measures Taken To Prevent Scraping Aren't Effective

Bots became more powerful in 2014. As the war continues, let’s take a closer look at why common strategies to prevent scraping didn’t pay off.

With the market for online businesses expanding rapidly, the development teams behind these online portals are under great pressure to keep up in the race. Scalability, availability and responsiveness are some of the common problems facing a growing online business portal. And as the value of content increases, content theft in the form of web scraping has become a growing problem.

Competitors have learned to stay ahead of the race by using bots to scrape. While the harm these bots can do is worth talking about, it is not the main scope of this article. This article discusses some of the commonly used weapons against bots and brings to light how effective they are in reality.

We come across many developers who claim to have taken measures to prevent their sites from being scraped. A common belief is that the techniques listed below reduce scraping activity significantly on a website. While some of these methods could work in concept, we were interested in exploring how effective they are in practice.

Most commonly used techniques to prevent scraping:

•    Setting up robots.txt – Surprisingly, this technique is used against malicious bots! Why it doesn't work is straightforward: robots.txt is an agreement between websites and search engine bots that keeps well-behaved crawlers away from sensitive content. No malicious bot (or the scraper behind it) in its right mind would obey robots.txt. This is the most ineffective method of preventing scraping.
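
A quick way to see why this fails: robots.txt is enforced only by the crawler's own good manners. The Python sketch below (with a hypothetical Disallow rule) shows that a polite crawler checks the file before fetching, while a scraper simply never performs that check.

```python
# robots.txt is purely advisory: the *client* decides whether to obey it.
from urllib import robotparser

robots_txt = """\
User-agent: *
Disallow: /pricing/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# A well-behaved search engine bot asks permission first:
print(rp.can_fetch("Googlebot", "/pricing/"))   # False -> it stays out
print(rp.can_fetch("Googlebot", "/about/"))     # True  -> allowed page

# A malicious bot never calls can_fetch() at all; nothing on the server
# side enforces the rule, so the Disallow line is meaningless to it.
```

The file can even backfire: a Disallow line advertises exactly which paths the site owner considers sensitive.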

•    Filtering requests by user agent – The user agent string of a client is set by the client itself. One method is to read it from the HTTP headers of a request, so that a request can be filtered even before any content is served. We observed that very few bots (fewer than 10%) used a default user agent string that belonged to a scraping tool, or an empty string. Once their requests were filtered on this basis, it didn't take long for scrapers to realize it and change their user agent to that of a well-known browser. This method merely stops new bots written by inexperienced scrapers for a few hours.
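
A minimal Python sketch of such a filter (the blocklist entries are hypothetical) makes the weakness obvious: the check keys entirely on a string the client itself supplies.

```python
# User-agent filtering: catches default tool UAs and empty strings only.
BLOCKED_TOKENS = ("python-requests", "scrapy", "curl", "wget")

def is_blocked(user_agent):
    ua = (user_agent or "").strip().lower()
    if not ua:
        return True   # empty UA string: essentially never a real browser
    return any(token in ua for token in BLOCKED_TOKENS)

print(is_blocked("python-requests/2.31.0"))  # True: default tool UA
print(is_blocked(""))                        # True: empty string
# One line of change on the scraper's side defeats the filter entirely:
print(is_blocked("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))  # False
```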

•    Blacklisting the IP address – Turning to an IP blacklisting service is much easier than the hectic process of capturing more metrics from page requests and analyzing server logs. There are plenty of third-party services that maintain databases of blacklisted IPs. In our hunt for a suitable blacklisting service, we found that third-party DNSBL/RBL services were not effective: they blacklist only email spambot servers and do little against scraping bots. Fewer than 2% of scraping bots were detected for one of our customers when we did a trial run.
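
For illustration, here is what the server-side check amounts to, sketched in Python with a hypothetical local blocklist (real deployments query a DNSBL/RBL service over DNS instead; the addresses below are reserved documentation ranges used as stand-ins):

```python
# IP blacklisting: only as good as the list, and scrapers rotate IPs.
import ipaddress

BLACKLIST = {
    ipaddress.ip_network("203.0.113.0/24"),    # a flagged subnet
    ipaddress.ip_network("198.51.100.42/32"),  # a single flagged address
}

def is_blacklisted(addr):
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in BLACKLIST)

print(is_blacklisted("203.0.113.7"))   # True: inside the flagged /24
print(is_blacklisted("192.0.2.1"))     # False: a scraper on a fresh proxy
```

The last line is the whole problem: a scraper behind a rotating proxy pool shows up with an address the list has never seen.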

•    Throwing CAPTCHA – A very well-known practice to stop bots is to throw a CAPTCHA on pages with sensitive content. Although effective against bots, a CAPTCHA is shown to every client requesting the page, regardless of whether it is a human or a bot. This method often antagonizes users and hence reduces traffic to the website. Some more insights into Google's new No CAPTCHA reCAPTCHA can be found in our previous blog post.

•    Honey pot or honey trap – Honey pots are a brilliant trap mechanism for capturing new bots (scrapers who are not well versed with the structure of every page) on a website. But this approach poses a lesser known threat: reducing the page rank on search engines. Here's why: search engine bots visit these links and might get trapped accidentally. Even if exceptions were made by disallowing a set of known user agents, the links to the traps might still be indexed by a search engine bot. Search engines interpret these as dead, irrelevant or fake links, and with more such traps, the ranking of the website decreases considerably. Furthermore, filtering requests based on user agent can be exploited, as discussed above. In short, honey pots are a risky business that must be handled very carefully.
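
As a rough illustration, a honey pot boils down to a link no human ever sees plus a server-side flag. The Python sketch below uses hypothetical paths and an in-memory set; note that the trap path would also need a robots.txt Disallow (and a rel="nofollow" on the link) to keep search engine bots out, which is exactly where the page-rank risk described above comes from.

```python
# Honey pot: an invisible link that only a crawler would ever follow.
TRAP_PATH = "/catalog-full-export"   # never linked visibly anywhere
trapped_ips = set()

# Embedded in pages, hidden from humans; search bots may still index it:
HIDDEN_LINK = '<a href="/catalog-full-export" style="display:none" rel="nofollow">.</a>'

def handle_request(path, client_ip):
    """Return an HTTP status code for a (path, client) pair."""
    if path == TRAP_PATH:
        trapped_ips.add(client_ip)   # anyone here is almost certainly a bot
        return 403
    if client_ip in trapped_ips:
        return 403                   # keep blocking the flagged client
    return 200

print(handle_request("/products", "198.51.100.9"))  # 200: normal visitor
print(handle_request(TRAP_PATH, "198.51.100.9"))    # 403: fell into the trap
print(handle_request("/products", "198.51.100.9"))  # 403: blocked from now on
```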

To summarize, the prevention strategies listed above are either weak or require constant monitoring and regular maintenance to remain effective. In practice, bots are far more challenging than they seem.

What to expect in 2015?

With the increasing demand for scraped data, the number of scraping tools and expert scrapers is also increasing, which simply means bots are going to be a growing problem. In fact, the use of headless browsers (browser-like bots used to scrape) is on the rise, and scrapers no longer rely only on wget, curl and HTML parsers. Preventing malicious bots from stealing content without disturbing the genuine traffic from humans and search engine bots is only going to get harder. By the end of the year, we could infer from our database that almost half of an average website's traffic is caused by bots, and a whopping 30-40% is caused by malicious bots. We believe this is only going to increase if we do not step up and take action!

P.S. If you think you are facing similar problems, why not request more information? And if you do not have the time or bandwidth to take such actions, scraping prevention and stopping malicious bots is something we provide as a service. How about a free trial?

Source: http://www.shieldsquare.com/why-common-measures-taken-to-prevent-scraping-arent-effective/