Two aspects of URLs are of significant import to SEO practitioners. One is the choice of a URL that carries the keywords; the other is whether the URLs are dynamic or static.
While many SEO service providers advise that keyword-stuffed URLs do not count, there are hardly any takers for their advice in reality. After all, no one wants to lose out to a competitor simply because a keyword-rich URL was not taken. Registering one certainly does no harm, so, all said and done, most practitioners play it safe and opt for keyword-rich URLs.
The enigma of search engine behaviour towards dynamic websites, and the extent to which these are SEO-friendly, is of even greater import than keyword-rich URLs. Mind you, dynamic sites are not the ones showing Flash animations or featuring some sort of literal visual dynamism. A dynamic website is one whose pages are created at run time, on the fly. The page may have no physical presence on the server (consequently you cannot download it using the FTP protocol); instead, it is generated from content or information stored in server-side databases. Each dynamic page is linked to a master template page. The benefit is that a limitless number of pages can be generated, which makes creating a multi-page website very easy.
Consequently, the designers' work is made easier. Whether a site is dynamic can be checked by looking at whether its page URLs contain special characters and query strings; if they do, the site is dynamic in nature.
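The check described above can be sketched in a few lines of Python. This is a rough heuristic only (the URLs below are made up for illustration): a URL carrying a query string is typically served by a dynamic, database-driven page.

```python
from urllib.parse import urlparse

def looks_dynamic(url):
    """Heuristic check: a URL with a query string (?, &, =)
    is usually generated dynamically on the server."""
    parsed = urlparse(url)
    return bool(parsed.query)

# Hypothetical example URLs:
print(looks_dynamic("https://example.com/shop/product.php?id=42&cat=7"))  # True
print(looks_dynamic("https://example.com/shop/blue-widget.html"))         # False
```

Real crawlers apply more nuanced rules, but the presence of a query string remains the simplest tell-tale sign.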
The pitfall of such a dynamic site is that search engines may not read through the full URL. If the portion they do read is identical across many different dynamically generated pages, the search engine bots may mistake them for the same URL. For example, if several pages share a URL that differs only after the last special character (&), and the search engines read only up to that character, then the URLs appear identical, creating an apparent duplicate-content problem. While many argue that present-day search engines can read through four to six different special characters and differentiate the pages, the problem is that if the URL is longer than what they can read, it will persist. Even otherwise, the crawling speed will be slow and the rankings will be affected.
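The collision described above can be simulated with a small sketch. The URLs and the one-parameter limit are purely hypothetical; the point is only to show how truncation collapses distinct pages into one apparent URL.

```python
def truncate_url(url, max_params=1):
    """Simulate a crawler that reads only the first `max_params`
    query parameters of a URL (hypothetical limit)."""
    base, _, query = url.partition("?")
    if not query:
        return base
    kept = query.split("&")[:max_params]
    return base + "?" + "&".join(kept)

# Three distinct dynamic pages (made-up URLs) differing only in the
# parameter after the last "&":
urls = [
    "https://example.com/page.php?cat=1&id=10",
    "https://example.com/page.php?cat=1&id=20",
    "https://example.com/page.php?cat=1&id=30",
]

seen = {truncate_url(u) for u in urls}
print(seen)  # all three collapse into a single truncated URL
```

From the truncating crawler's point of view, the three pages are indistinguishable, which is exactly the duplicate-content risk discussed above.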
So, what should be done? The most commonly used strategy is to convert the URLs to static-looking ones using one of the various tools available for the purpose. But what would you do if the client refuses? You probably have no option but to go ahead with SEO for the dynamic site. Results will still come, just as they do for static sites; the only problem is that the rankings will fluctuate sharply, coming and going with less stability. You will have to live with it, and the client will have to understand it as well.
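One common way to implement the static-URL strategy on an Apache server is a rewrite rule. The paths and parameter names below are hypothetical, a minimal sketch of the idea rather than a drop-in configuration:

```apache
# Hypothetical .htaccess fragment: serve the dynamic page
# product.php?id=42 under the static-looking URL /product/42
RewriteEngine On
RewriteRule ^product/([0-9]+)/?$ product.php?id=$1 [L,QSA]
```

With a rule like this, the site keeps generating pages dynamically, but search engines see clean, query-string-free URLs.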