Just to give an idea of how much the Web grew:
In March and April 1994, the World Wide Web Worm received an average of about 1500 queries per day. In November 1997, Altavista claimed it handled roughly 20 million queries per day. With the increasing number of users on the web, and automated systems which query search engines, top search engines handled hundreds of millions of queries per day by the year 2000.
Which makes a better search engine: high-quality human-maintained indices such as Yahoo!, or automated search engines that rely on keyword matching?
Human-maintained lists usually cover popular topics effectively, but they are subjective, expensive to build and maintain, slow to improve, and cannot cover all topics. Automated search engines that rely on keyword matching usually return too many low-quality matches. To make matters worse, some advertisers attempt to gain people's attention by taking measures meant to mislead automated search engines.
These tasks are becoming increasingly difficult as the Web grows, but hardware performance and technology have also improved dramatically alongside the Internet.
The goal of an effective and useful search engine is to make it easy to find almost anything on the Web.
Since users are still only willing to look at the first few tens of results, very high precision is needed: the fraction of returned documents that are relevant, measured, say, over the top tens of results.
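The precision notion mentioned above can be made concrete with a short sketch. The document IDs and relevance judgments below are hypothetical, used only to illustrate the computation:

```python
def precision_at_k(returned_docs, relevant_docs, k):
    """Fraction of the first k returned documents that are relevant."""
    top_k = returned_docs[:k]
    if not top_k:
        return 0.0
    hits = sum(1 for doc in top_k if doc in relevant_docs)
    return hits / len(top_k)

# Hypothetical example: a query returns ten documents, of which
# only three in the top ten are actually relevant to the user.
returned = ["d1", "d2", "d3", "d4", "d5", "d6", "d7", "d8", "d9", "d10"]
relevant = {"d2", "d5", "d9"}
print(precision_at_k(returned, relevant, 10))  # 0.3
```

A search engine with this precision in its top results would frustrate users, which is why a keyword-matching engine drowning relevant pages in low-quality matches is considered ineffective.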
Another problem that limits the effectiveness of a search engine is that the Web has become increasingly commercial over time, and it is very likely to become even more commercial in the near future.