When you use Feeds for a CSV import, you have to make a few decisions. When importing a large number of records (say, more than 2,000), you should probably turn off periodic import and turn on "Process in background". This causes Feeds to import a small batch of records (around 50) on each cron run. To make Feeds import more records per cron run, add the following to your settings.php:
$conf['feeds_process_limit'] = 500;
After clearing the cache, Feeds will import 500 records each time you run cron, provided PHP has enough memory available and a long enough maximum execution time. Here are some possible php.ini settings. You may need to raise these for your particular project, but be careful not to exhaust your server's resources:
max_execution_time = 30 ; Maximum execution time of each script, in seconds
max_input_time = 60 ; Maximum amount of time each script may spend parsing request data
memory_limit = 256M ; Maximum amount of memory a script may consume
Always save the CSV as UTF-8; you can use Notepad++ to do this. I also found that if you get odd encoding errors, uploading the spreadsheet to Google Docs and then downloading it again as a CSV will usually produce a correctly encoded file.
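If you prefer the command line, iconv can do the conversion. This is a minimal sketch assuming the file was saved as Windows-1252 (a common Excel default on Windows); the filenames are placeholders, and the sample input is created here only for illustration.

```shell
# Create a sample CSV with Windows-1252 bytes (\351 = é, \374 = ü),
# then convert it to UTF-8 before handing it to Feeds.
printf 'name,city\nRen\351,Z\374rich\n' > import.csv
iconv -f WINDOWS-1252 -t UTF-8 import.csv > import-utf8.csv
```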