HET End-Trimester Report
First Period of 2009
December 1 - March 31

This report is composed of five sections: Facility Status, Observing Statistics, Observing Programs Status, Institution Status, and TAC Response.
Facility Status

In this section we discuss the status of the HET facility and each instrument, as well as any limitations on configurations that occurred during the period.


Observing Statistics

The following FWHM image quality statistics were taken from the LRS pre-imaging data and from the FIF acquisition guider during science operations, as recorded in the night reports.

For comparison, here is the LRS pre-imaging image quality for the same period of 2008.

Here are the DIMM values reported in the night report.

Month by Month Summary

The following table gives the observing statistics for each month. The second column (A) gives the fraction of the month that was available for science (as opposed to engineering or instrument commissioning). Science time is defined to begin at 18 degree twilight or the first science target, and to end at 18 degree twilight or the last science target. The fourth column (C) gives the fraction of the possible science time lost due to weather. The remaining columns (D through I) give the breakdown of the actual science time (after removing weather losses): the fraction spent with the shutter open on science targets and the fractions lost to overhead, calibrations, alignment, problems, or unaccounted time. Please note that the first stack of the night often occurs before 18 degree twilight.

The columns are:
  A - Fraction of the time that was possible science
  B - Average night length (hours)
  C - Fraction of total science time lost due to weather
  D - Fraction of actual science time spent with shutter open
  E - Fraction of actual science time lost due to overhead
  F - Fraction of actual science time lost due to calibrations
  G - Fraction of actual science time lost due to alignment
  H - Fraction of actual science time lost due to problems
  I - Fraction of actual science time not accounted for or lost

Month      A      B      C     D     E     F     G     H     I
December   0.872  10.95  0.19  0.39  0.20  0.02  0.01  0.35  0.03
January    1.000  10.76  0.26  0.64  0.29  0.02  0.02  0.03  0.01
February   0.962  10.16  0.31  0.57  0.28  0.01  0.07  0.07  0.01
March      0.919   9.29  0.44  0.47  0.28  0.03  0.05  0.16  0.01

The last column (I) accounts for all of the minutes lost or not used. Some of this time is due to accounting errors, some is time not charged to any program due to operations inefficiency, and some is due to uncharged overheads from observations that did not result in a FITS file.
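As a worked illustration using the December row (and assuming, for this sketch only, that the fractions compose multiplicatively): the average actual science time per night would be roughly B x A x (1 - C) = 10.95 x 0.872 x (1 - 0.19), or about 7.7 hours, and the fractions in columns D through I partition that time, summing to 1.00 (0.39 + 0.20 + 0.02 + 0.01 + 0.35 + 0.03 = 1.00).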

Details on nightly cloud cover, based on the TO's observations of the sky reported three times a night in the night report:

Fraction of the nights in each cloud-cover category:

Month      Clear  Mostly Clear  Partly Cloudy  Mostly Cloudy  Cloudy
December   0.27   0.07          0.37           0.10           0.07
January    0.29   0.29          0.26           0.10           0.03
February   0.29   0.18          0.29           0.11           0.08
March      0.19   0.23          0.29           0.10           0.13

Please note that the HET could be closed due to humidity, smoke or high dust count and still have a "Clear" statistic in the night report.

The following tables give a breakdown of all attempted visits and the category into which each falls.

Charged exposures
Number of Times   Shutter Open (Hours)   Type
1197              251.6                  A - Acceptable
53                14.4                   B - Acceptable but borderline conditions
724               124.9                  4 - Priority 4 visits (does not include 1/2 charge)
0                 0.0                    Q - Charged but PI error
0                 0.0                    C - Acceptable by RA but PI rejects

Uncharged exposures
Number of Times   Shutter Open (Hours)   Type
27                5.8                    I - Targets observed under otherwise idle conditions
41                7.2                    E - Rejected by RA for equipment failure
17                4.6                    H - Rejected for human failure
206               40.7                   W - Rejected by RA for weather
0                 0.0                    P - Rejected by PI and confirmed by RA
8                 1.2                    N - Rejected due to unknown cause

This gives a total of 59.5 hours of uncharged spectra, with an additional 14.4 hours of borderline (B) spectra that may yet be rejected.
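(For reference, 59.5 hours is simply the sum of the shutter-open hours in the uncharged table: 5.8 + 7.2 + 4.6 + 40.7 + 0.0 + 1.2 = 59.5.)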

The following overhead statistics include slew, setup, readout and refocus between exposures (if there are multiple exposures per visit). In the summary page for each program the average setup time is calculated. The table below gives the average setup time for each instrument PER VISIT and the average and maximum COMPLETED science exposures and visits.

An "Exposure" is defined by when the CCD shutter opens and closes. A "Visit" is the requested total CCD shutter open time during a track and might be made up of several "Exposures". A "Visit" as defined here contains no overhead. To calculate one type of observing efficiency metric, one might divide the "Visit" by the sum of "Visit" + "Overhead".
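As an illustration of that metric, using the LRS averages from the table below (these are averages, so this is only an approximate figure, not a per-visit measurement): efficiency = 2004.4 s / (2004.4 s + 14.7 min x 60 s/min) = 2004.4 / 2886.4, or about 0.69.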

The average overhead per actual visit is the overhead for each acceptable priority 0-3 science target (not borderline, and with overheads > 4 minutes, to avoid second halves of exposures with unrealistically low overheads). This number reflects how quickly we can move from object to object on average for each instrument; however, this statistic tends to weight the overhead toward programs with a large number of targets, such as planet search programs.

The average overhead per requested visit is the total charged overhead per requested priority 0-3 visit, averaged per program. To get this value we average the per-program averages presented in the program status web pages. The average overhead per visit can be inflated by extra overhead charged for PI mistakes (such as bad finding charts or no targets found at sky detection limits) or for incomplete visits (e.g., two visits of 1800 s done instead of one visit with a CRsplit of 2). It can be deflated by the 15 minute cap applied to the HRS and MRS. This method tends to weight the overhead toward programs with few targets and poorly chosen requested visit lengths, i.e., lengths very close to the track length.
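A minimal sketch of how the two averages can differ (the program names and overhead values below are hypothetical and chosen only to illustrate the weighting; they are not taken from the actual queue, and the sketch ignores the requested-versus-actual visit distinction):

    # Hypothetical per-visit overheads in minutes for two programs: a planet-search
    # program with many short setups, and a program with a few long setups.
    programs = {
        "planet_search": [6, 7, 6, 8, 7, 6, 7, 6],
        "long_tracks": [18, 22],
    }

    # Average overhead per actual visit: every visit counts equally,
    # so the many-target program dominates the result.
    all_visits = [t for visits in programs.values() for t in visits]
    per_actual_visit = sum(all_visits) / len(all_visits)

    # Average overhead per requested visit: average within each program first,
    # then average the program means, so the few-target program carries more weight.
    program_means = [sum(v) / len(v) for v in programs.values()]
    per_requested_visit = sum(program_means) / len(program_means)

    print(per_actual_visit, per_requested_visit)  # roughly 9.3 versus 13.3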

Instrument   Avg Overhead per      Avg Overhead per         Avg Exposure   Max Exposure   Avg Visit   Max Visit
             Actual Visit (min)    Requested Visit (min)    (sec)          (sec)          (sec)       (sec)
LRS          14.7                  15.6                     1029.2         2400           2004.4      3600
HRS          7.9                   8.3                      984.6          3480           1309.3      5580
MRS          7.8                   9.2                      1558.9         1762           3599.9      5286

NOTE: AS OF 2003-3 THE SETUP TIME FOR AN ATTEMPTED MRS OR HRS TARGET IS CAPPED AT 15 MINUTES. AS OF 2004-3 THE SETUP TIME FOR AN ATTEMPTED LRS TARGET IS CAPPED AT 20 MINUTES.

The overhead statistics can be shortened by multiple setups (each one counted as a separate visit) while on the same target, as is the case for planet search programs. The overhead statistics can be lengthened by having multiple tracks that add up to a single htopx visit, as can happen for very long tracks where each attempt might only yield a half visit.

A way to reduce the overhead accumulated by programs with long exposure times is to add double the above overhead to the requested visit length and make sure that the result is still shorter than the actual track length. This keeps the RA from having to split requested visits across several different tracks.
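For example (with hypothetical numbers, purely to make the rule concrete): an HRS target requesting 3600 s of shutter-open time, combined with the roughly 8 minute average overhead above, would be padded to 3600 + 2 x 8 x 60 = 4560 s, or about 76 minutes; as long as that is shorter than the track length, the RA should be able to complete the visit in a single track.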

During this period we have had some trouble with LRS setups which have turned out to be training issues.


Observing Programs Status

The following links give the summary for each institution and its programs. The resulting table gives, for each program, the total number of targets in the queue and the number completed, the CCD shutter open hours, the average overhead for that program, and the TAC allocated time. This is usually the best metric for judging completeness, but there are times when a PI will tell us that a target is "done" before the total number of visits is complete.


Institution Status

This is how each institution has allocated its time by priority.

Time Allocation by Institution (hours)
Institution  Priority 0  Priority 1  Priority 2  Priority 3  Priority 4
PSU          12.330      33.570      37.330      37.320      89.000
UT           19.000      73.250      83.000      79.500      80.000
Stanford     3.000       9.000       12.000      12.000      4.000
Munich       9.000       9.000       18.000      15.000      0.000
Goetting     0.000       0.000       0.000       0.000       0.000

Goetting has not sent in an allocation.

Stanford and Goetting have under-allocated time relative to their institutional shares. Munich put in sufficient hours for this period, but we could use more Munich targets to help reduce their total deficit.

The following is a summary of the total charged time for each institution based on our htopx database (for shutter open time) and night reports (for overhead). It includes shutter open time (weighting priority four by half) and overhead.
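In other words, for each institution the charged time in the table below is computed as (priority 0-3 shutter-open hours) + 0.5 x (priority 4 shutter-open hours) + (charged overhead hours).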

Time Charged by Institution (hours)
Institution  Used This Period  % of All This Period  Total to Date  % to Date  % Allocation to Date
PSU          114.9             27.3                  2344.9         28.4       28.52
UT           225.5             53.7                  4661.1         56.5       54.87
Stanford     28.1              6.7                   555.1          6.7        6.74
Munich       31.5              7.5                   383.5          4.6        4.96
Goetting     20.3              4.8                   305.2          3.7        4.91


The Total to Date columns start from Oct 1999. Priority weighting for P4 began in the 2003-2 period.


The following is a histogram of the current HET queue visits at the end of the period.

The following histogram shows the bright time priority 0-4 contribution from each partner.

The following histogram shows the dark time priority 0-3 contribution from each partner.

The following histogram shows the use of LRS_g2, LRS_g3 and LRS_e2 for priority 0-2 programs in this period.

From the above plots I have determined that:

TAC Response