In this section we discuss the status of the HET facility, each instrument, and any configuration limitations that occurred during the period.
NOTE: We now use the FWHM, measured off both the LRS "pre" images and the FIF bent-prime guider.
Here is a histogram from the last period.
For comparison, here is the image quality from the same period last year.
We expect image quality about as good as in the previous period and year. Note that we have changed from a GFWHM measurement on the LRS to a FWHM, and we now measure the FWHM on the ACQ camera using the same software.
We expect to have a similar overhead to last period. The following overhead statistics include slew, setup, readout and refocus between exposures (if there are multiple exposures per visit). In the summary page for each program the average setup time is calculated. The table below gives the average setup time for each instrument PER REQUESTED VISIT AND PER ACTUAL VISIT. The table also gives the average and maximum PROPOSED science exposures and visit lengths.
The "Exposure" is defined by when the CCD opens and closes. A "Visit" is the requested total CCD shutter open time during a track and might be made up of several "Exposures". "Visit" as defined here contains no overhead. To calculate one type of observing efficiency metric one might divide the "Visit" by the sum of "Visit" + "Overhead".
The average overhead per actual visit is the overhead for each acceptable priority 0-3 science target (not borderline, and with overheads > 4 minutes, to exclude the second halves of split exposures, which have unrealistically low overheads). This number reflects how quickly we can move from object to object on average for each instrument; however, this statistic tends to weight the overhead toward programs with a large number of targets, such as planet search programs.
The average overhead per requested visit is the total charged overhead per requested priority 0-3 visit, averaged per program. To get this value we average the average overhead for each program as presented in the program status web pages. The average overhead per visit can be inflated by extra overhead charged for PI mistakes (such as bad finding charts or no targets found at sky detection limits) or for incomplete visits, e.g. 2 visits of 1800 s are done instead of 1 visit with a CRsplit of 2. It can be deflated by the 15-minute cap applied to the HRS and MRS. This method tends to weight the overhead toward programs with few targets and poorly chosen requested visit lengths, i.e., lengths very close to the track length.
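The two averaging schemes can be illustrated with a short sketch. The record layout and field names below are hypothetical, not taken from any actual HET database; the point is only that the first scheme averages over visits while the second averages each program's own average, so it weights programs equally:

```python
def avg_overhead_per_actual_visit(visits: list) -> float:
    """Mean overhead over acceptable priority 0-3 visits with > 4 min
    overhead (the second halves of split exposures are excluded)."""
    ok = [v["overhead_min"] for v in visits
          if v["priority"] <= 3 and v["overhead_min"] > 4.0]
    return sum(ok) / len(ok)

def avg_overhead_per_requested_visit(programs: list) -> float:
    """Average of each program's own average overhead; a program with two
    targets counts the same as one with two hundred."""
    per_program = [sum(p) / len(p) for p in programs]
    return sum(per_program) / len(per_program)

visits = [{"priority": 1, "overhead_min": 10.0},
          {"priority": 2, "overhead_min": 6.0},
          {"priority": 4, "overhead_min": 20.0},  # excluded: priority 4
          {"priority": 1, "overhead_min": 2.0}]   # excluded: <= 4 min
print(avg_overhead_per_actual_visit(visits))           # prints 8.0
print(avg_overhead_per_requested_visit([[10.0, 6.0], [4.0]]))  # prints 6.0
```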
Instrument | Avg Overhead per Actual Visit(min) | Avg Overhead per Requested Visit(min) | Avg Exposure (sec) | Max Exposure (sec) | Avg Visit (sec) | Max Visit (sec) |
---|---|---|---|---|---|---|
LRS | 12.3 (from last period) | 12.3 (from last period) | 878.8 | 2000 | 1287.6 | 3600 |
HRS | 6.5 (from last period) | 6.8 (from last period) | 618.8 | 3600 | 737.8 | 4800 |
MRS | 11.8 (from last period) | 10.3 (from last period) | N/A | N/A | N/A | N/A |
NOTE: AS OF 2003-3 THE SETUP TIME FOR AN ATTEMPTED MRS OR HRS TARGET IS CAPPED AT 15 MINUTES.
NOTE: STARTING IN 2004-3 THE SETUP TIME FOR AN ATTEMPTED LRS TARGET WILL BE CAPPED AT 20 MINUTES.
The overhead statistics can be shortened by multiple setups (each counted as a separate visit) on the same target, as is the case for planet search programs. They can be lengthened by multiple tracks that add up to a single htopx visit, as can happen for very long tracks where each attempt might yield only half a visit.
A way to reduce the overhead accumulated by programs with long exposure times is to add double the above overhead to the requested visit length and make sure that the total is shorter than the actual track length. This avoids the RA having to split requested visits across several different tracks.
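The rule of thumb above reduces to a simple check, sketched below with illustrative names (this is not an actual htopx tool):

```python
def fits_in_one_track(visit_s: float, avg_overhead_s: float,
                      track_s: float) -> bool:
    """True if the requested visit, padded with twice the average
    overhead, still fits within a single track."""
    return visit_s + 2.0 * avg_overhead_s <= track_s

# Example: an average HRS visit (737.8 s) with 6.5 min of average
# overhead comfortably fits within a one-hour track.
print(fits_in_one_track(737.8, 6.5 * 60.0, 3600.0))  # prints True
```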
The following links give the summary for each institution and its programs.
The resulting table gives, for each program, the total number of targets in the queue and the number completed, the CCD shutter-open hours, the average overhead for that program, and the TAC-allocated time. This is usually the best metric for judging completeness, but there are times when a PI will tell us that a target is "done" before the total number of visits is complete.
This is how each institution has allocated its time by priority.
Observing Programs Status
Program comments:
UT13-2-001: (Marshall)
UT13-2-002: (Cochran)
UT13-2-003: (Cochran)
UT13-2-004: (Gullikson)
UT13-2-005: (Oliveira)
UT13-2-006: (Afsar)
UT13-2-007: (Shetrone)
UT13-2-008: (Wheeler) LRS TOO
UT13-2-009: (Shetrone)
UT13-2-010: (Shetrone)
UT13-2-011: (Papovich)
UT13-2-012: (Gebhardt) LRS g2
UT13-2-013: (Afsar)
Program comments:
PSU13-2-001: (Wolszczan)
PSU13-2-002: (Gronwall) LRS g2 and g3
PSU13-2-003: (Fox) LRS TOO
PSU13-2-004: (Eracleous) LRS g3
PSU13-2-005: (Eracleous) LRS g2 and g3
PSU13-2-006: (Deshpande)
PSU13-2-007: (Deshpande) No PhaseII
PSU13-2-008: (Mahadevan)
PSU13-2-009: (Wade)
PSU13-2-010: (Wade)
PSU13-2-011: (Brandt) LRS g2
Program comments:
M13-2-001: (Saglia)
Program comments:
G13-2-001: (Kollatschny) LRS g2
G13-2-002: (Zechmeister)
G13-2-003: (Kollatschny) LRS g2
G13-2-004: (Kollatschny) LRS g2
Institution Status
Time Allocation by Institution (hours)

Institution | Priority 0 | Priority 1 | Priority 2 | Priority 3 | Priority 4 |
---|---|---|---|---|---|
PSU | 7.500 | 22.500 | 30.000 | 30.000 | 39.000 |
UT | 24.000 | 67.200 | 93.500 | 85.000 | 145.000 |
Munich | 3.800 | 3.800 | 7.600 | 7.600 | 0.000 |
Goettingen | 0.000 | 7.600 | 7.500 | 6.500 | 0.000 |