ds2dd

2023-09-07

library(stRoke)

Easy data set to database workflow

This function can be used as a simple tool to create a database metadata file for REDCap (a so-called DataDictionary) based on a given data set.

Step 1 - Load your data set

Here we’ll use the sample TALOS dataset included with the package.

data("talos")
ds <- talos
# As the data set lacks an ID column, one is added
ds$id <- seq_len(nrow(ds))

Step 2 - Create the DataDictionary

datadictionary <- ds2dd(ds, record.id = "id", include.column.names = TRUE)

Additional specifications to the DataDictionary can now be made manually, or the file can be uploaded and modified in the graphical user interface on the web page.

The function converts column names to lower case and replaces spaces with underscores. The output is a list containing the DataDictionary and a vector of new column names to apply to the data set so that it matches the metadata.
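The renaming rule can be sketched as follows. This is an illustration only; the actual transformation happens inside `ds2dd`, and the example names are made up:

```r
# Sketch of the renaming rule: lower case, then spaces -> underscores
raw_names <- c("Age at Onset", "mRS 90 days")
new_names <- gsub(" ", "_", tolower(raw_names))
new_names
#> [1] "age_at_onset" "mrs_90_days"
```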

Step 3 - Meta data upload

The DataDictionary can now be exported as a spreadsheet and uploaded manually, or it can be uploaded directly with the REDCapR package (only for projects with “Development” status).

Use one of the two approaches below:

Manual upload

# row.names = FALSE avoids an extra index column, which REDCap would reject
write.csv(datadictionary$DataDictionary, "datadictionary.csv", row.names = FALSE)

Upload with REDCapR

REDCapR::redcap_metadata_write(
  datadictionary$DataDictionary,
  redcap_uri = keyring::key_get("DB_URI"),
  token = keyring::key_get("DB_TOKEN")
)

Chapter 1.1 of the “REDCap R Handbook” covers interfacing with REDCap from R in more detail, including using the keyring package to store credentials.
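For the calls above to work, the credentials must first be stored in the system keyring. A minimal one-time setup could look like this (the key names `DB_URI` and `DB_TOKEN` simply match those used in the examples; `keyring::key_set()` prompts interactively for the value):

```r
# One-time setup: store REDCap credentials in the system keyring
keyring::key_set("DB_URI")    # prompts for the REDCap API URL
keyring::key_set("DB_TOKEN")  # prompts for the project API token
```

After this, `keyring::key_get()` retrieves the values without exposing them in scripts or history.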

Step 4 - Data upload

The same two options are available for data upload as for metadata upload: manual or through REDCapR. Only the latter is shown here.

# new column names are applied
colnames(ds) <- datadictionary$`Column names`

REDCapR::redcap_write(
  ds,
  redcap_uri = keyring::key_get("DB_URI"),
  token = keyring::key_get("DB_TOKEN")
)