but it traces only about 62 residues. The density looks somewhat reasonable, though.

The files [https://{{SERVERNAME}}/pub/xds-datared/1g1c/xds-simulated-1g1c-I.mtz xds-simulated-1g1c-I.mtz] and [https://{{SERVERNAME}}/pub/xds-datared/1g1c/xds-simulated-1g1c-F.mtz xds-simulated-1g1c-F.mtz] are available.

I refined against 1g1c.pdb:
showing that the anomalous completeness, and even the quality of the anomalous signal, can indeed be increased. I doubt, however, that going to three or more frames would improve things further.

The MTZ files are at [https://{{SERVERNAME}}/pub/xds-datared/1g1c/xds-simulated-1g1c-F-2frames.mtz] and [https://{{SERVERNAME}}/pub/xds-datared/1g1c/xds-simulated-1g1c-I-2frames.mtz], respectively. They were of course obtained with XDSCONV.INP:
INPUT_FILE=temp.ahkl
OUTPUT_FILE=temp.hkl CCP4_I
</pre>
Using the default (see above) phenix.refine job, I obtain against the [https://{{SERVERNAME}}/pub/xds-datared/1g1c/xds-simulated-1g1c-F-2frames.mtz MTZ file with amplitudes]:
 Start R-work = 0.3434, R-free = 0.3540
 Final R-work = 0.2209, R-free = 0.2479
and against the [https://{{SERVERNAME}}/pub/xds-datared/1g1c/xds-simulated-1g1c-I-2frames.mtz MTZ file with intensities]:
 Start R-work = 0.3492, R-free = 0.3606
 Final R-work = 0.2244, R-free = 0.2504
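For reference, the R values quoted here follow the standard crystallographic definition R = &Sigma;||F<sub>obs</sub>| &minus; |F<sub>calc</sub>|| / &Sigma;|F<sub>obs</sub>|, computed by phenix.refine over the working and free sets. A minimal sketch with invented toy amplitudes (not the 1g1c data):

```python
# Standard crystallographic R factor: R = sum(|Fo - Fc|) / sum(Fo).
# The amplitudes below are made-up toy values, purely for illustration.

def r_factor(f_obs, f_calc):
    """R = sum |Fo - Fc| / sum Fo, over one set of reflections."""
    num = sum(abs(fo - fc) for fo, fc in zip(f_obs, f_calc))
    den = sum(f_obs)
    return num / den

f_obs  = [120.0, 85.0, 40.0, 230.0]
f_calc = [110.0, 90.0, 35.0, 250.0]
print(round(r_factor(f_obs, f_calc), 4))  # -> 0.0842
```

R-work is this quantity over the reflections used in refinement, R-free the same over the held-out test set; a gap between the two of a few percent, as seen above, is normal.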
=== Finally solving the structure ===

After thinking about the most likely way that James Holton used to produce the simulated data, I hypothesized that within each frame the radiation damage is constant, and that there is a jump in radiation damage from frame 1 to frame 2. Unfortunately for this scenario, the scaling algorithm in CORRECT and XSCALE was changed with the Dec-2010 version such that it produces best results when the changes are smooth. Therefore, I tried the penultimate (May-2010) version of XSCALE - and indeed that gives significantly better results:
 NOTE: Friedel pairs are treated as different reflections.
 total 165799 42025 43003 97.7% 11.7% 12.3% 162399 10.07 13.5% 14.8% 17% 0.908 16219
Using these data (stored in [https://{{SERVERNAME}}/pub/xds-datared/1g1c/xscale.oldversion]), I was finally able to solve the structure (see screenshot below) - SHELXE traced 160 out of 198 residues. All files produced by SHELXE are in [https://{{SERVERNAME}}/pub/xds-datared/1g1c/shelx].

[[File:1g1c-shelxe.png]]
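The R-meas column in the XSCALE table is the redundancy-independent merging R factor of Diederichs &amp; Karplus, R<sub>meas</sub> = &Sigma;<sub>h</sub> &radic;(n/(n&minus;1)) &Sigma;<sub>i</sub> |I<sub>i</sub> &minus; &lt;I&gt;| / &Sigma;<sub>h</sub>&Sigma;<sub>i</sub> I<sub>i</sub>. A toy sketch of that formula, with invented intensities grouped by unique reflection (not the 1g1c data):

```python
from math import sqrt

# Redundancy-independent merging R factor:
# R_meas = sum_h sqrt(n/(n-1)) * sum_i |I_hi - <I_h>| / sum_h sum_i I_hi
# Intensity groups below are invented; real values come from XSCALE.

def r_meas(groups):
    num = 0.0
    den = 0.0
    for obs in groups:               # one group = all measurements of one hkl
        n = len(obs)
        if n < 2:
            continue                 # single observations carry no merging info
        mean = sum(obs) / n
        num += sqrt(n / (n - 1)) * sum(abs(i - mean) for i in obs)
        den += sum(obs)
    return num / den

groups = [[100.0, 110.0, 95.0], [40.0, 38.0], [500.0, 520.0, 505.0, 515.0]]
print(f"{r_meas(groups):.3f}")
```

Unlike the plain merging R factor, the &radic;(n/(n&minus;1)) term prevents the statistic from looking artificially better at high multiplicity, which matters here since the merged two-frame data have a multiplicity of about 10.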
It is worth mentioning that James Holton confirmed my hypothesis; he also said that this approach is a good approximation for a multi-pass data collection.

However, generally (i.e. for real data) the smooth scaling (which also applies to the absorption correction and detector modulation) gives better results than the previous method of assigning the same scale factor to all reflections of a frame; in particular, it correctly treats reflections near the border of two frames.
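The difference between the two models can be sketched numerically: the pre-Dec-2010 behaviour corresponds to a step function (one scale factor per frame, which matches the simulated data's abrupt jump in damage between frames), whereas the newer algorithm interpolates smoothly across frame borders (which matches real, continuously accumulating damage). The scale factors below are invented for illustration; real values come from CORRECT/XSCALE refinement:

```python
# Toy comparison of per-frame (step) vs smooth (interpolated) scaling.
# frame_scale values are hypothetical, not refined XSCALE output.

frame_scale = [1.00, 0.85]           # assumed scales of frames 1 and 2

def step_scale(z):
    """Old behaviour: every reflection on a frame gets that frame's scale.
    z is a continuous rotation coordinate measured in frame units."""
    return frame_scale[int(z)]

def smooth_scale(z):
    """New behaviour (schematically): interpolate between frame centres."""
    centres = [0.5, 1.5]
    if z <= centres[0]:
        return frame_scale[0]
    if z >= centres[1]:
        return frame_scale[1]
    t = (z - centres[0]) / (centres[1] - centres[0])
    return (1 - t) * frame_scale[0] + t * frame_scale[1]

# A reflection recorded exactly at the border of frames 1 and 2 (z = 1.0):
print(step_scale(0.999), step_scale(1.0))   # step model jumps: 1.0 0.85
print(round(smooth_scale(1.0), 3))          # smooth model: 0.925
```

For the simulated data the step model is the physically correct one, which is why the May-2010 XSCALE wins here; for real data the smooth model handles border reflections more sensibly.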
Phenix.refine against these data gives:
 Start R-work = 0.3449, R-free = 0.3560
 Final R-work = 0.2194, R-free = 0.2469
which is only 0.15%/0.10% better in R-work/R-free than the previous best result (see above).

This example shows that it is important to
* have the best data available if a structure is difficult to solve
* know the options (programs, algorithms)
* know as much as possible about the experiment