Data Operations should allow for reasonable numerical fluctuations in q (Trac #493) #626
Labels: Enhancement (feature requests and/or general improvements), Major (big change in the code or important change in behaviour)
Data Operations allows a user to perform simple arithmetic on two data sets (add, subtract, multiply, divide). At this time it does not interpolate, so a check is made before performing a requested operation that the two data sets have exactly the same number of q points AND that each q point is **EXACTLY** the same. This is generally appropriate, but it will cause the operation to fail even when the q values are essentially identical and differ only in the last decimal place.
The attached 2D files provided by Jun-Li Lin of the University of Illinois are an example. These are large data sets, presumably generated by a SAXS camera. Somehow the qx of 0.0755253 (either positive or negative) is so close to 0.07552525 that in the blank data set it evaluated to 0.0755252. This was the ONLY discrepancy, yet the subtract operation fails, as expected. The difference is roughly 1.5 ppm, so the two are clearly the same value. However, in order to fix this we will have to decide on a cutoff: how close is close enough? A more complicated but more satisfying answer would be to allow for an option that sets the maximum allowed jiggle.
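As a sketch of the cutoff idea, a tolerance-based comparison could replace the exact-equality check. This is only an illustration, not SasView's actual code; the helper name `q_axes_match` and the default tolerance are assumptions (10 ppm comfortably accepts the ~1.5 ppm discrepancy above while still rejecting genuinely different axes):

```python
import numpy as np

def q_axes_match(q1, q2, rtol=1e-5):
    """Return True if two q axes are compatible for arithmetic.

    Requires the same number of points, but tolerates small numerical
    jitter (default: 10 ppm relative tolerance) instead of demanding
    bit-exact equality. The rtol parameter is the "maximum allowed
    jiggle" suggested in the ticket.
    """
    q1 = np.asarray(q1, dtype=float)
    q2 = np.asarray(q2, dtype=float)
    if q1.shape != q2.shape:
        return False
    return bool(np.allclose(q1, q2, rtol=rtol, atol=0.0))

# The ~1.5 ppm discrepancy from the attached files would now pass:
print(q_axes_match([0.0755253], [0.0755252]))  # True
print(q_axes_match([0.0755253], [0.0760000]))  # False
```

Exposing `rtol` in the Data Operations panel would give users the option to set the cutoff themselves.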
Another option that might be interesting is to allow interpolation.
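The interpolation option could look something like the following sketch for 1D data. The function name and signature are hypothetical; it simply resamples the second data set onto the first data set's q grid with linear interpolation before subtracting:

```python
import numpy as np

def subtract_with_interp(q1, i1, q2, i2):
    """Subtract data set 2 from data set 1 on data set 1's q grid.

    Data set 2's intensities are linearly interpolated onto q1, so
    the two sets need not share identical q points (q2 must be
    monotonically increasing for np.interp).
    """
    q1, i1 = np.asarray(q1, dtype=float), np.asarray(i1, dtype=float)
    q2, i2 = np.asarray(q2, dtype=float), np.asarray(i2, dtype=float)
    i2_on_q1 = np.interp(q1, q2, i2)  # resample set 2 onto q1
    return q1, i1 - i2_on_q1
```

This sidesteps the cutoff question entirely for 1D data, at the cost of introducing interpolation error; 2D data like the attached files would need a 2D scheme instead.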
Migrated from http://trac.sasview.org/ticket/493