A large CSV file with thousands of records can cause the import process to fail when the file is missing lat/lng values: every missing coordinate must be geocoded, so the import takes too long to complete. In addition, the error message "LAT/LNG Fetch Failed" appears when the import tool cannot fetch the coordinates for a provided address.
To prevent the "LAT/LNG Fetch Failed" message, make sure the "State", "City", and "Zipcode" columns are filled in for every store. If any of these columns is missing, validation will stop the import tool from fetching the coordinates.
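As an illustration, a minimal CSV row might look like the following (the column names other than State, City, and Zipcode are assumptions for the example, not the plugin's exact import template):

```csv
Name,Street,City,State,Zipcode,Lat,Lng
Sample Store,123 Main St,Springfield,IL,62701,,
```

When the Lat and Lng fields are left empty, geocoding falls back on the address columns, which is why State, City, and Zipcode must all be present.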
Agile Store Locator is capable of handling a CSV file with thousands of records and importing them within seconds. The failure occurs when the server times out after exceeding "max_execution_time", which happens when lat/lng data is missing from the CSV file.
The missing lat/lng coordinates are filled in automatically by the Google Geocoding API, which takes up to a second per missing coordinate. An import can therefore take quite a long time if the CSV file has thousands of records with missing coordinates.
Increase "max_execution_time" on your server (Apache/Nginx) and check whether the problem persists.
To increase the import speed on an Ubuntu server, follow these steps:
1- Open php.ini
2- Find the line "max_execution_time" (use Ctrl+F to search)
3- Change the default value max_execution_time = 30 to max_execution_time = 0
4- Save php.ini
5- Restart the Apache/Nginx server
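The steps above can be sketched from the command line. This assumes PHP 8.1 with Apache on Ubuntu; the php.ini path varies by PHP version and SAPI, so confirm yours with `php --ini` first:

```shell
# Path assumed for PHP 8.1 under Apache; verify with `php --ini` on your server.
INI=/etc/php/8.1/apache2/php.ini

# Step 3: change max_execution_time from its default (30) to 0 (no limit),
# so a long geocoding import is not killed after 30 seconds.
sudo sed -i 's/^max_execution_time = .*/max_execution_time = 0/' "$INI"

# Step 5: restart the web server so the new limit takes effect.
sudo systemctl restart apache2   # or: sudo systemctl restart nginx php8.1-fpm
```

Setting the value to 0 removes the limit entirely; if you prefer a cap, a large value such as 300 also works.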
Alternatively, split your CSV file into multiple smaller CSV files and import each one into Agile Store Locator separately.
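One way to do the split is with the standard `split` utility, keeping the header row in every chunk so each file imports on its own. The sketch below builds a sample `stores.csv` (a hypothetical filename and layout) just for demonstration; point it at your real export instead:

```shell
# Build a sample CSV with 1200 data rows to demonstrate; use your own export in practice.
printf 'Name,City,State,Zipcode,Lat,Lng\n' > stores.csv
seq 1 1200 | sed 's/.*/Store &,Springfield,IL,62701,,/' >> stores.csv

# Keep the header, split the data rows into 500-row chunks,
# then prepend the header to every chunk so each file is a valid import on its own.
head -n 1 stores.csv > header.csv
tail -n +2 stores.csv | split -l 500 - chunk_
for f in chunk_*; do
  cat header.csv "$f" > "stores_part_${f#chunk_}.csv"
  rm "$f"
done
rm header.csv

ls stores_part_*.csv   # -> stores_part_aa.csv stores_part_ab.csv stores_part_ac.csv
```

Each resulting file (500 rows plus the header, with a smaller remainder file) can then be imported one at a time without hitting the execution-time limit.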