The following tips and tricks will help you import large datasets. If you are working with a small dataset, you can use the menu item Develop > Import data. See
Import Data.
Copy and Paste from Excel to a Form
Keep the number of rows under 10,000 and the row width under 8,000 characters. We've noticed that Internet Explorer handles large pasted data sets better than Firefox. The browser may crash if you try to paste a very large data set into a text field; if that happens, try one of the methods below. You can also paste the data into a .txt file and upload it using the tab-delimited file method described below.
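If your spreadsheet is larger than these limits, you can split it into smaller pieces before pasting. Below is a minimal Python sketch, assuming a tab-delimited export named export.txt, that splits the data into chunks of at most 10,000 rows and repeats the header row in each chunk.

# split_export.py - split a tab-delimited export into pieces of at most
# 10,000 data rows so each piece stays within the paste limits above.
# The input file name "export.txt" is an assumption; adjust as needed.

CHUNK_ROWS = 10000

with open("export.txt", "r", encoding="utf-8") as source:
    header = source.readline()               # first line holds the column names
    rows = source.readlines()

for start in range(0, len(rows), CHUNK_ROWS):
    part_number = start // CHUNK_ROWS + 1
    with open("export_part%d.txt" % part_number, "w", encoding="utf-8") as target:
        target.write(header)                 # repeat the header in every piece
        target.writelines(rows[start:start + CHUNK_ROWS])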
Tab-Delimited File Upload
A single flat-file import can be many megabytes, though upload time will depend on your connection speed. Because it is just one flat file, only a single table will be built. All the data should be consistent: date fields should contain valid dates and number fields should contain only numbers, with no letters mixed in. These rules apply to all import methods.
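Before uploading, it can help to scan the file for the kinds of inconsistencies described above. The Python sketch below is one way to do that; the file name and the positions of the date and number columns are assumptions for illustration.

# check_flatfile.py - sanity-check a tab-delimited file before upload.
# The file name and the column positions below (a date in column 2 and a
# number in column 3, counting from zero) are assumptions; adjust them.
import csv
from datetime import datetime

DATE_COL, NUMBER_COL = 2, 3

with open("export.txt", newline="", encoding="utf-8") as source:
    reader = csv.reader(source, delimiter="\t")
    header = next(reader)
    for line_no, row in enumerate(reader, start=2):
        if len(row) != len(header):
            print("line %d: expected %d columns, found %d" % (line_no, len(header), len(row)))
            continue
        try:
            datetime.strptime(row[DATE_COL], "%Y-%m-%d")    # date fields must be valid dates
        except ValueError:
            print("line %d: bad date %r" % (line_no, row[DATE_COL]))
        try:
            float(row[NUMBER_COL])                          # number fields must be numbers
        except ValueError:
            print("line %d: bad number %r" % (line_no, row[NUMBER_COL]))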
Excel 97/2000 .xls Import
Keep the file size under 20 MB, with fewer than 100,000 records per sheet and fewer than 20 sheets. Match the sheet names to the table names you want in Qrimp. Use this method if you want to build a relational model supporting multiple data sets automatically: each sheet will become a table.
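If you want to check a workbook against these limits before uploading, something like the Python sketch below will do; it uses the third-party xlrd package, which reads the legacy .xls format, and the file name is an assumption.

# check_workbook.py - check an .xls workbook against the limits above.
# Requires the third-party xlrd package; the file name is an assumption.
import os
import xlrd

PATH = "import.xls"

size_mb = os.path.getsize(PATH) / (1024.0 * 1024.0)
if size_mb >= 20:
    print("file is %.1f MB; keep it under 20 MB" % size_mb)

book = xlrd.open_workbook(PATH)
if book.nsheets >= 20:
    print("%d sheets; keep the count under 20" % book.nsheets)

for sheet in book.sheets():
    records = sheet.nrows - 1                # minus the header row
    if records >= 100000:
        print("sheet %r has %d records; keep each sheet under 100,000" % (sheet.name, records))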
Importing Large Datasets
If you are converting an existing application or have extremely large sets of data to import, the web interface is not a viable option. Network latency, script timeouts, and other factors tuned for a high-scale environment like a web application are not designed to accommodate very long-running batch processes, so large data sets need to be imported without going through the web.
Qrimp Hosted
If you are using Qrimp's hosted solution, you can import data through the REST-like API. With millions of records this can take a while, so in that scenario it is best to contact support first. You can also download the
Qrimp Server Installation and use the method described below.
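As a rough illustration of the REST-like API approach, the Python sketch below posts rows from a tab-delimited file one at a time. The endpoint URL, API key header, and payload format are assumptions for illustration, not the documented Qrimp API; check the API documentation or contact support for the actual details.

# push_rows.py - illustrative only: post rows from a tab-delimited file
# to a REST-like API one record at a time. The URL, API key header, and
# payload format are assumptions, not the documented Qrimp API.
import csv
import requests

API_URL = "https://example.qrimp.com/api/mytable"     # assumption
HEADERS = {"X-Api-Key": "YOUR_API_KEY"}               # assumption

with open("export.txt", newline="", encoding="utf-8") as source:
    reader = csv.DictReader(source, delimiter="\t")
    for row in reader:
        response = requests.post(API_URL, json=row, headers=HEADERS)
        response.raise_for_status()                   # stop on the first failure

Posting one record per request is slow for millions of rows, which is why contacting support or using the server installation is the better route at that scale.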
Qrimp Enterprise Solution
If you are using the Enterprise solution installed on your local network, there are many options available for importing data. You will probably want help from someone experienced with Microsoft SQL Server to perform one of these imports.
The first step in importing a large data set into an installed Qrimp is to define the application. This is usually easiest through Qrimp's web interface, defining the tables manually. This process creates the underlying primary keys, foreign keys, and support tables needed to run your Qrimp application.
A more difficult but more flexible approach is to define the data model directly, using SQL Server Management Studio or an export from UML, Earl, or another relational modeling tool. For this to work, each table must have an INT IDENTITY field named id.
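For example, a table created outside the web interface needs an id column like the one below. This is a minimal Python sketch using pyodbc; the connection string, table name, and other columns are assumptions, and only the INT IDENTITY column named id reflects the requirement above.

# create_table.py - sketch of defining a table that meets the id requirement.
# The connection string, table name, and other columns are assumptions;
# the required piece is the INT IDENTITY column named id.
import pyodbc

CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=localhost;DATABASE=MyQrimpApp;Trusted_Connection=yes;")   # assumption

DDL = """
CREATE TABLE Customers (
    id INT IDENTITY(1,1) PRIMARY KEY,   -- every table needs an INT IDENTITY field named id
    Name NVARCHAR(200),
    SignupDate DATETIME,
    Balance DECIMAL(18, 2)
)
"""

connection = pyodbc.connect(CONN_STR)
connection.execute(DDL)
connection.commit()
connection.close()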
The next step is to get the data into your data model.
Data Transformation Services (DTS)
Qrimp is built on top of Microsoft SQL Server, so getting data into Qrimp is similar to migrating data between any two existing databases. The best way we have found is to use Microsoft's DTS to import the data. Here are a few resources on the web to help you understand how to use DTS to import your data.
Creating a DTS Package with SQL Server 2005
Creating a DTS Package with SQL Server 2008
Data import / export with SQL Server Express using DTS Wizard
You can download DTS for SQL Server 2005 Express here:
http://download.microsoft.com/download/4/4/D/44DBDE61-B385-4FC2-A67D-48053B8F9FAD/SQLServer2005_DTS.msi
Download More Microsoft SQL Server 2005 Express Tools Here