== Simple and advanced usage ==

[http://xds.mpimf-heidelberg.mpg.de/~kabsch/xds/html_doc/xscale_parameters.html XSCALE] is the stand-alone scaling program of the XDS suite. It scales reflection files (typically called XDS_ASCII.HKL) produced by XDS. Since the CORRECT step of XDS ''already scales'' an individual dataset, XSCALE is only ''needed'' if several datasets should be scaled relative to one another. However, it does not deteriorate (over-fit) a dataset that is "scaled again" in XSCALE, since the supporting points of the scale factors are at the same positions in detector and batch space.

One advantage of using XSCALE for a single dataset is that the user can specify the number and limits of the resolution shells. Another is that zero-dose extrapolation can be done.
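For the single-dataset case, a minimal XSCALE.INP could look like the following sketch (the file names and the shell limits are arbitrary placeholders, not recommendations):

 OUTPUT_FILE=temp.ahkl
 INPUT_FILE=XDS_ASCII.HKL
 RESOLUTION_SHELLS= 10 6 4 3 2.5 2.2 2.0   ! user-chosen number and limits of the resolution shells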
   
   
At the XDS website, there is a short and a long commented example of [http://xds.mpimf-heidelberg.mpg.de/html_doc/INPUT_templates/XSCALE.INP XSCALE.INP].
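To run the program, call the xscale (or the parallel xscale_par) binary of the XDS package in the directory that contains XSCALE.INP; the log is written to XSCALE.LP in the same directory. Assuming the binaries are in the PATH:

 xscale_par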


----
  FRIEDEL'S_LAW=FALSE
  STRICT_ABSORPTION_CORRECTION=TRUE        ! see XDSwiki:Tips_and_Tricks
  ! the star in front of the file name indicates that this dataset is the reference with respect to falloff
  INPUT_FILE= *../fae-rh/xds_2/XDS_ASCII.HKL
  FRIEDEL'S_LAW=FALSE
  STRICT_ABSORPTION_CORRECTION=TRUE

=== Keywords unique to XSCALE ===
* REIDX_ISET=                    ! re-index data from the most recent INPUT_FILE
* MERGE=                        ! average intensities from all input files, applies to output file
* WEIGHT=                        ! applies to input file
* DOSE_RATE=                    ! (optional for radiation damage correction f.i.r.)
* 0-DOSE_SIGNIFICANCE_LEVEL=    ! (optional for radiation damage correction f.i.r.)
* SAVE_CORRECTION_IMAGES=        ! Default is TRUE. If FALSE, don't write DECAY*.cbf MODPIX*.cbf ABSORP*.cbf
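As an illustration only, here is a sketch of how a few of the keywords above might be combined for two datasets; the file names are placeholders, and the 12-integer transformation given to REIDX_ISET= is an arbitrary example written in the same style as the REIDX= keyword of XDS (an assumption; check the XSCALE documentation for the exact syntax):

 OUTPUT_FILE=merged.ahkl
 MERGE=TRUE                               ! average intensities from all input files into the output file
 SAVE_CORRECTION_IMAGES=FALSE             ! don't write DECAY*.cbf MODPIX*.cbf ABSORP*.cbf
 INPUT_FILE= set1/XDS_ASCII.HKL
 INPUT_FILE= set2/XDS_ASCII.HKL
 REIDX_ISET= 0 1 0 0  1 0 0 0  0 0 -1 0   ! re-index data from the most recent INPUT_FILE (here: set2)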


== Radiation damage correction ==
and check the column "ACCEPTED REFLECTIONS". Then remove the dataset(s) with fewest accepted reflections, and re-run the program. Repeat if necessary.


XSCALE makes it explicit which dataset(s) it cannot scale; it prints out e.g. "no common reflections with data set          197". If you get this message for many datasets, I suggest adding a line
 MINIMUM_I/SIGMA=2 ! reduce to 1, or 0.5, or 0.25, or 0.125, or ... to lower the cutoff
after each INPUT_FILE= line, to increase the number of reflections available for scaling. However, MINIMUM_I/SIGMA= should not be decreased needlessly below its default of 3.
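For two datasets the resulting pattern in XSCALE.INP would look like this (paths are placeholders):

 INPUT_FILE= set1/XDS_ASCII.HKL
 MINIMUM_I/SIGMA=1               ! placed after each INPUT_FILE= line, as suggested above
 INPUT_FILE= set2/XDS_ASCII.HKL
 MINIMUM_I/SIGMA=1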


XSCALE may also finish with the error message !!! ERROR !!! INACCURATE SCALING FACTORS. This usually indicates that one or more datasets are linearly dependent on others (this happens if the ''same'' data are included more than once as INPUT_FILE), or are pure noise.
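A quick way to check whether the same data were accidentally listed more than once is to look for repeated INPUT_FILE lines, e.g. with a shell one-liner (this only catches literally identical lines, not the same data reached via different paths):

 grep INPUT_FILE XSCALE.INP | sort | uniq -d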