You first need to understand search engines and how they work. First, let me define a search engine. You may be familiar with Google, Inc., the most prevalent search engine at the time of this writing. Other search engines have worn the crown in the past, including AltaVista and Inktomi, to name a few.
A search engine typically takes the form of a web site that presents the visitor with an HTML form to fill out, which then searches the search engine's database of content. On Google's web site, you can enter specific "keywords" into the form, and it will find the relevant web pages on the Internet that contain those search terms. For example, if I search for "horse veterinarians," I would expect the results to be web pages that contain the words "horse" and "veterinarians." Of course, a ranking order is applied through the search engine's algorithm to determine which sites are the most relevant.
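To make the idea concrete, here is a toy Python sketch of keyword matching over a handful of hypothetical pages, ranked by how many of the search terms each page contains. The page URLs and text are made up for illustration, and this is only the bare matching-and-ranking idea, not a representation of Google's actual algorithm.

```python
# A toy illustration of keyword matching and ranking. The "index" below is
# a hypothetical handful of pages, not real data.
pages = {
    "https://example.com/equine-care": "Our horse veterinarians treat horses of all ages.",
    "https://example.com/dog-grooming": "Dog grooming tips from professional groomers.",
    "https://example.com/vet-directory": "Find veterinarians near you, including horse and cattle vets.",
}

def search(query, pages):
    """Return pages containing the query terms, ranked by how many terms match."""
    terms = query.lower().split()
    results = []
    for url, text in pages.items():
        words = text.lower().split()
        score = sum(1 for term in terms if term in words)
        if score > 0:
            results.append((score, url))
    # More matching terms means a higher rank in this simplistic scheme.
    return [url for score, url in sorted(results, reverse=True)]

print(search("horse veterinarians", pages))
```

A real search engine scores pages with far more signals than raw term counts, but the basic flow, match the query terms and then order the results, is the same.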
Search engines, for the most part, all work the same way. A search engine company operates what is called a "spider," an automated program that "crawls" the Internet looking for web sites. The spider reads the content of the web pages it finds, classifies them, and assigns each page a value based in part on its backlinks. In Google's case, this value is called PageRank (more on this later).
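As a rough illustration, the Python sketch below does the two most basic jobs of a spider: it fetches a page and extracts the links it finds there. The URL is a placeholder, and real crawlers such as Googlebot also handle robots.txt, politeness delays, duplicate detection, and much more, none of which is shown here.

```python
# A bare-bones sketch of what a spider does: fetch a page, read its HTML,
# and record which pages it links to.
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag encountered in the page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl_page(url):
    """Fetch one page and return the outbound links found in its HTML."""
    html = urlopen(url).read().decode("utf-8", errors="replace")
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

# Each link found here is a backlink for the page it points to, which is the
# raw material a search engine uses when valuing pages.
for link in crawl_page("https://example.com/"):
    print(link)
```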
A deep crawl is when a spider, for example Googlebot, crawls an entire web site. The first time Googlebot encounters a site, it indexes only the first page. It can take up to a full month before Googlebot comes back and does a deep crawl, indexing every page that is interlinked on the site. Patience is a virtue with SEO, as improvements are not seen overnight.
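Continuing the earlier sketch (and reusing its hypothetical crawl_page() helper), a deep crawl can be pictured as following every internal link from the home page until no new pages on the site turn up. Error handling and crawl-delay etiquette are omitted; this is just the shape of the idea, not how Googlebot actually schedules its visits.

```python
# A rough sketch of a "deep crawl": starting from the home page, follow every
# internal link until all interlinked pages on the site have been visited.
# Assumes the crawl_page() helper defined in the previous sketch.
from urllib.parse import urljoin, urlparse

def deep_crawl(start_url):
    """Visit every page on the same site reachable from start_url."""
    site = urlparse(start_url).netloc
    to_visit = [start_url]
    indexed = set()
    while to_visit:
        url = to_visit.pop()
        if url in indexed:
            continue
        indexed.add(url)  # this page is now "in the index"
        for link in crawl_page(url):
            absolute = urljoin(url, link)
            # Stay on the same site; external links are not part of a deep crawl.
            if urlparse(absolute).netloc == site and absolute not in indexed:
                to_visit.append(absolute)
    return indexed

print(deep_crawl("https://example.com/"))
```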