Data Controller for SAS: File Uploads¶
Data Controller supports the ingestion of two file formats - Excel (any version) and CSV.
If you would like to support other file types, do get in touch!
Data can be uploaded in regular (tabular) or dynamic (complex) format. For details, see the Excel section.
The following should be considered when uploading data in this way:
- A header row (with variable names) is required
- Variable names must match those in the target table (not case sensitive). An easy way to ensure this is to download the data from Viewer and use this as a template.
- Duplicate variable names are not permitted
- Missing columns are not permitted
- Additional columns are ignored
- The order of variables does not matter EXCEPT for the (optional) `_____DELETE__THIS__RECORD_____` variable. When using this variable, it must come first
- The delimiter is extracted from the header row - so for `var1;var2;var3` the delimiter would be assumed to be a semicolon
- Note that the delimiter is assumed to be the FIRST special character in the header row, so characters that can appear within variable names are unsuitable as delimiters
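The header checks above can be sketched as follows. This is an illustrative Python sketch of the documented rules (first-special-character delimiter detection, case-insensitive matching, duplicate and missing columns rejected, extra columns ignored) - the function names are hypothetical and this is not the actual backend implementation.

```python
import re

def detect_delimiter(header_line: str) -> str:
    """Assume the delimiter is the first character in the header
    row that cannot appear in a variable name (per the docs above)."""
    match = re.search(r"[^A-Za-z0-9_]", header_line)
    if not match:
        raise ValueError("No delimiter found in header row")
    return match.group(0)

def validate_header(csv_columns, target_columns):
    """Compare the uploaded header against the target table:
    duplicates and missing columns are errors, extra columns are
    ignored, and matching is case-insensitive."""
    seen = [c.lower() for c in csv_columns]
    if len(seen) != len(set(seen)):
        raise ValueError("Duplicate variable names in header")
    missing = {c.lower() for c in target_columns} - set(seen)
    if missing:
        raise ValueError(f"Missing columns: {sorted(missing)}")

detect_delimiter("var1;var2;var3")  # returns ";"
```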
When loading dates, be aware that Data Controller makes use of the
ANYDTDTE / ANYDTDTM informats (width 19).
This means that uploaded date / datetime values should be unambiguous - a value such as
`01/02/42` could be interpreted as either 02JAN2042 or 01FEB2042, depending on your LOCALE and
YEARCUTOFF settings. Note that UTC datetimes with offset values (eg
`2018-12-26T09:19:25.123+0100`) are not currently supported. If this is a feature you would like to see, contact us.
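The ambiguity can be demonstrated outside SAS - here is a short Python sketch showing how the same string yields two different dates under month-first and day-first conventions, while an ISO-8601 value parses only one way:

```python
from datetime import datetime

raw = "01/02/42"

# The same string parses two ways depending on locale convention:
us = datetime.strptime(raw, "%m/%d/%y")  # month first -> 02JAN2042
eu = datetime.strptime(raw, "%d/%m/%y")  # day first   -> 01FEB2042
assert us != eu

# An unambiguous ISO-8601 value has only one interpretation:
iso = datetime.strptime("2042-01-02T09:19:25", "%Y-%m-%dT%H:%M:%S")
```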
To get a copy of a file in the right format for upload, use the file download feature in the Viewer tab.
Lengths are taken from the target table. If a CSV contains strings longer than the target length (eg
"ABCDE" for a $3 variable) then the excess is silently truncated (only
"ABC" is staged and loaded). If the target variable is a short numeric (eg 4 bytes) then floats or large integers may be rounded. This issue does not apply to Excel uploads, which are first validated in the browser.
When loading CSVs, the entire file is passed to the backend for ingestion. This is more efficient for large files, but it does mean that frontend validations are bypassed.