aggregate_for_orientation returns a dataframe of accelerometer orientations, estimated using the method of Mizell, 2003, over each epoch (see compute_orientation). The epoch start time is used as the timestamp in the first column.

aggregate_for_orientation(
  df,
  epoch,
  estimation_window = 2,
  unit = "deg",
  st = NULL
)

Arguments

df

dataframe. Input accelerometer data in mhealth format. First column should be timestamps in POSIXt format.

epoch

string. Any format that is accepted by the breaks argument of cut.POSIXt. For example, "1 sec", "1 min", "5 secs", "10 mins".

estimation_window

number. Duration in seconds to be used to estimate orientation within each epoch. Default is 2 (seconds), as suggested by Mizell, 2003.

unit

string. The unit of orientation angles. Can be "deg" (degree) or "rad" (radian). Default is "deg".

st

character or POSIXct timestamp. An optional start time used as the reference point when generating epochs. If it is NULL, the function uses the first timestamp in the timestamp column as the start time. This is useful when you are processing a stream of data and want a common start time for segmenting data across chunks. Default is NULL.
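The epoch strings map directly onto base R's cut.POSIXt. A minimal base-R illustration (independent of this package) of how a breaks string such as "2 sec" segments timestamps into epochs keyed by their start times:

```r
# Base-R illustration: how an epoch string segments timestamps via cut.POSIXt
ts <- as.POSIXct("2016-01-15 11:00:00", tz = "UTC") + seq(0, 5.5, by = 0.5)

# cut() with a breaks string assigns each timestamp to its epoch start time
epochs <- cut(ts, breaks = "2 sec")
table(epochs)  # 4 samples fall into each of the three 2-second epochs
```

The level labels of the resulting factor are the epoch start times, which is why the first column of the output of aggregate_for_orientation holds epoch start timestamps.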

Value

dataframe. The returned dataframe has epoch start times in the first (timestamp) column, followed by the estimated orientation angles for each axis (X_ANGLE, Y_ANGLE, Z_ANGLE).

Details

This function accepts a dataframe (in mhealth accelerometer data format) and computes the estimated accelerometer orientations (as x, y, and z angles) over each fixed epoch. The returned dataframe has four columns (the timestamp plus the three angles) and uses the same datetime format in the timestamp column as the input dataframe. The orientation estimation method used in the function is based on Mizell, 2003.
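The Mizell, 2003 estimate can be sketched in a few lines of base R. This is an illustrative reconstruction, not the package's implementation: it assumes the gravity vector is estimated as the per-axis mean over the estimation window, and that each output angle is the angle between that vector and the corresponding axis. The mizell_angles name is hypothetical.

```r
# Hypothetical sketch of Mizell-style orientation angles (not the package code).
# xyz: numeric matrix of accelerometer samples (columns X, Y, Z) taken from one
# estimation window.
mizell_angles <- function(xyz, unit = "deg") {
  g <- colMeans(xyz)                    # estimate gravity as the per-axis mean
  angles <- acos(g / sqrt(sum(g^2)))    # angle between gravity and each axis
  if (unit == "deg") angles * 180 / pi else angles
}

# A device lying flat (gravity along +Z) gives angles of about 90, 90, 0 degrees
mizell_angles(matrix(c(0, 0, 1), nrow = 1))
```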

Note

If the epoch argument is not provided or is NULL, the function treats the input dataframe as a single epoch.

If the number of samples in an epoch is too small to estimate orientation (for example, when the epoch is shorter than estimation_window), the orientation angles will be NaN (invalid) for that epoch.

How is it used in mims-unit algorithm?

This function is used in the mims-unit algorithm after extrapolation (see extrapolate). Orientation angles are estimated from the extrapolated signal using this function.
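In context, this step looks roughly like the sketch below. It is a hedged illustration of the pipeline order only; extrapolate is called with its default arguments, which are assumed to be appropriate for the sample data.

```r
library(MIMSunit)

# Hedged sketch of this step in the mims-unit pipeline: extrapolate the raw
# signal first, then estimate per-epoch orientation angles from the result.
df <- sample_raw_accel_data
extrapolated <- extrapolate(df)  # default arguments assumed
aggregate_for_orientation(extrapolated, epoch = "5 secs", unit = "deg")
```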

See also

aggregate_for_mims for aggregating to get integrated values for each axis for each epoch.

Other aggregate functions: aggregate_for_mims()

Examples

# Use sample input data
df = sample_raw_accel_data
head(df)
#>     HEADER_TIME_STAMP     X      Y      Z
#> 1 2016-01-15 11:00:00 0.148 -0.438  0.016
#> 2 2016-01-15 11:00:00 0.215 -0.418 -0.023
#> 3 2016-01-15 11:00:00 0.266 -0.402 -0.012
#> 4 2016-01-15 11:00:00 0.336 -0.430  0.012
#> 5 2016-01-15 11:00:00 0.430 -0.320  0.000
#> 6 2016-01-15 11:00:00 0.535 -0.258  0.004
# set epoch to 1 second and unit to degree
# each 1-second epoch is shorter than the default 2-second estimation_window,
# so no epoch has enough samples to estimate orientation angles.
aggregate_for_orientation(df, epoch = '1 sec', unit = 'deg')
#>     HEADER_TIME_STAMP X_ANGLE Y_ANGLE Z_ANGLE
#> 1 2016-01-15 11:00:00     NaN     NaN     NaN
#> 2 2016-01-15 11:00:01     NaN     NaN     NaN
#> 3 2016-01-15 11:00:02     NaN     NaN     NaN
#> 4 2016-01-15 11:00:03     NaN     NaN     NaN
#> 5 2016-01-15 11:00:04     NaN     NaN     NaN
#> 6 2016-01-15 11:00:05     NaN     NaN     NaN
#> 7 2016-01-15 11:00:06     NaN     NaN     NaN
# set epoch to 2 seconds and unit to radian
# last epoch does not have enough samples to estimate orientation angles.
aggregate_for_orientation(df, epoch = '2 sec', unit = 'rad')
#>     HEADER_TIME_STAMP   X_ANGLE  Y_ANGLE  Z_ANGLE
#> 1 2016-01-15 11:00:00 0.2001082 1.551401 1.769937
#> 2 2016-01-15 11:00:02 0.1908424 1.584946 1.761100
#> 3 2016-01-15 11:00:04 0.1840464 1.604471 1.751666
#> 4 2016-01-15 11:00:06       NaN      NaN      NaN
# epoch set to 1 second, unit to radian, and st set to 1 second before the
# start time of the data, forcing epoch boundaries to be generated relative
# to that reference time. With 1-second epochs and the default 2-second
# estimation_window, no epoch has enough samples, so all orientation angles
# are NaN (invalid).
aggregate_for_orientation(df, epoch = '1 sec', unit = 'rad', st = df[1, 1] - 1)
#>     HEADER_TIME_STAMP X_ANGLE Y_ANGLE Z_ANGLE
#> 1 2016-01-15 11:00:00     NaN     NaN     NaN
#> 2 2016-01-15 11:00:01     NaN     NaN     NaN
#> 3 2016-01-15 11:00:02     NaN     NaN     NaN
#> 4 2016-01-15 11:00:03     NaN     NaN     NaN
#> 5 2016-01-15 11:00:04     NaN     NaN     NaN
#> 6 2016-01-15 11:00:05     NaN     NaN     NaN
#> 7 2016-01-15 11:00:06     NaN     NaN     NaN