

Results from the survey given in June 2008

There were 344 responses to the survey, down from the 366 responses received last year. Below is a plot showing the response rate for the past 6 years.

[Plot: survey response rate over the past six years]

We solicited responses from all subscribers to the Archive Newsletter, everyone with a GALEX CasJobs account, and everyone who had retrieved data from DADS in the previous 28 months. We also placed an invitation to participate in the survey on the MAST web pages.

Questions 1 through 6 were designed to characterize MAST usage and to determine the platforms and browsers MAST users are running. The remaining questions solicit input on various tools and services MAST has implemented or may implement in the future. We provided several places for responders to enter comments and suggestions, and we have included a selected subset of the comments in this summary. If several people gave the same or a similar response, we included only one of them; in some cases, similar comments were included in two different comment fields. In some respects we find the comments the most useful part of the survey, and we would like to receive them throughout the year as you discover needs or have questions. You may send suggestions to the archive help desk at archive@stsci.edu, call 410-338-4547 between 9:00 AM and 5:00 PM eastern time, or use the MAST Suggestion Box.

1. How often have you used MAST in the past 12 months?

[Plot: how often responders use MAST]

2. What missions did you access? (multiple choices permitted)

[Plot: missions accessed]

3. Which operating system do you primarily use to access MAST?

[Plot: operating systems used]

4. Which browser do you primarily use for your work?

[Plot: browsers used]

5. Please rank the fraction of your MAST usage that is related to the following activities:

[Plot: fraction of MAST usage by activity]

6. Which kind of documentation about MAST would you most like to have available?

Comments:

  • I want PDF that I can download

  • highest preference would actually be a platform independent PDF manual (instead of paper and instead of web-based tutorials)

  • Would like to be able to print parts of manual on occasion

  • Better links to scientists specializing in missions

  • A helping forum

  • Wiki should be tried also, with access to all or through archive accounts if you are worried.

  • Need to find a guide on MAST in relation to other archives (which one is it best to go to for something?)

  • Web based is also helpful

  • Tool tips are annoying - perhaps pop-up help when clicking on something?

  • I find the movies useless.

7. The GALEX Map Browser was first deployed in October 2007. If you are familiar with this tool, please check all that apply. I use or plan to use the GALEX Map Browser to:

Additional features desired or comments:

  • Click on stars to get their coordinates and fluxes...

  • The galex tool is next to useless - quicklooks analogous to the other missions should be set up.

  • Legend is terrible.

8. We have recently deployed the new GalexView page, which searches on either objects or sky tiles.

If you have used this tool, please tell us how useful you find it:

  • Need to be able to cross match many objects!

  • Will be useful when lists of targets can be queried at once

  • Have not explored it enough yet, but like it.

  • On the second release I could not get it to function

  • Also bad assumptive legends.

9. If you use scripts to access MAST data and/or catalogs, please list the language(s) used.

Please describe the kinds of tasks performed:

  • Accessing GALEX CasJobs.

  • Image extraction.

  • Retrieve DSS images for finding charts.

  • Automated retrieval by dataset name.

  • List-based retrievals of many files.

  • Automate retrieval of large data sets.

  • Search all the available data for a given object.

  • Catalog matching; finding objects with, e.g., HST images in more than one filter with WFPC2 and/or ACS, with a minimum exposure time in each filter and more than a minimum overlap.

  • Parsing info on coords & datasets.
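Several of the scripted tasks above (finding charts, list-based retrievals) reduce to building a query URL and fetching it. As a minimal sketch, here is a hypothetical Python helper for pulling DSS finding charts for a target list; the `dss_search` endpoint path and its parameter names are assumptions based on the classic STScI DSS cutout form and should be checked against the current MAST documentation before use.

```python
# Sketch of a list-based retrieval script for DSS finding charts.
# The dss_search endpoint and parameter names (v, r, d, e, h, w, f)
# are assumptions modeled on the STScI DSS cutout service.
from urllib.parse import urlencode

DSS_URL = "http://archive.stsci.edu/cgi-bin/dss_search"

def dss_cutout_url(ra_deg, dec_deg, size_arcmin=15.0, survey="poss2ukstu_red"):
    """Build a URL that should return a FITS cutout centered on (ra, dec)."""
    params = {
        "v": survey,        # survey/plate version
        "r": ra_deg,        # right ascension (degrees)
        "d": dec_deg,       # declination (degrees)
        "e": "J2000",       # coordinate equinox
        "h": size_arcmin,   # cutout height (arcmin)
        "w": size_arcmin,   # cutout width (arcmin)
        "f": "fits",        # output format
    }
    return DSS_URL + "?" + urlencode(params)

# List-based retrieval: one finding chart per target.
targets = [("NGC 1068", 40.6696, -0.0133), ("M31", 10.6847, 41.2690)]
for name, ra, dec in targets:
    url = dss_cutout_url(ra, dec)
    # urllib.request.urlretrieve(url, name.replace(" ", "_") + ".fits")
    print(name, url)
```

The commented-out `urlretrieve` line is where the actual download would go; printing the URLs first is a cheap way to sanity-check the query before fetching many files.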

10. If you are interested in using scripts to access the MAST archive, please tell us about the kinds of tasks for which you would like example scripts to be provided:

Responses:

  • Quick data access scripts: I work with local galaxies. It would be very nice to have a script which gives me all HST data for a particular object, e.g.: > getmast ngc1068 hst acs

  • Multi coordinate access.

  • Quick and accurate tables or statistics on types of observations; better able to discern (include/exclude) various exposure types

  • Time and wavelength selected data. E.g. specify observations available in multiple wavelengths for a specified time on an object or observations available at multiple times (and instruments) for a specified wavelength region.

  • Queries from HTML pages are very useful for constructing web tables that already contain the MAST queries.

  • Automated requests *and* retrievals, that repeat on a specified schedule.

  • Retrieval by dataset name; retrieval by proposal ID.

  • Retrieve a catalog of objects with selection criteria; retrieve a catalog of objects given a list of positions.

  • Search for / fetch a list of targets [+ calibrations]. Look for archive data of a certain kind (e.g., list all the available archive spectra of galactic nuclei with certain constraints on the setup or target position).

  • Exposure time / error weighted combining spectra (FUSE,IUE).

  • Downloading specific lists of GALEX spectra in bulk. e.g. ~100 1-d spectra from a list of objids.

  • Submit retrieval request from script. Currently I can only get the dataset information from script but still have to retrieve them via web interface.

  • Retrieving data (IDL task would be great).

  • I'd like to see examples of how to access the archive with Python. It would be useful to submit a list of coordinates, have magnitudes and extinctions returned to Python, and then let my code run with these values. If I did a lot of work with the database, it would appear worthwhile to learn SQL, but it's hard to justify the effort if I use it just a few times a year. An interface with Python would be much more useful.

  • Objects in a specified range of redshifts.

  • I download large amounts of data which is difficult to do through the web interface. It would be useful to have a script to search for the relevant data and transfer to the disks.

  • Image data 'thumbnail' extraction (HST).

  • Image acquisition and data analysis scripts.

  • There are cases where I know the exact names of the files I wish to have, but for now MAST cannot handle this request directly. It would be great if you could have an option to download a file just by filename.
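The `getmast ngc1068 hst acs` idea above can be sketched as a small URL builder. This is a hypothetical helper, not an official MAST tool: the `search.php` endpoint and the parameter names (`target`, `sci_instrume`, `outputformat`, `action`) mirror the form-based HST search as we understand it and should be verified against the MAST scripted-query documentation.

```python
# Hypothetical "getmast"-style helper: list HST datasets for a target
# as CSV. Endpoint and parameter names are assumptions modeled on the
# MAST HST search form.
from urllib.parse import urlencode

MAST_SEARCH = "http://archive.stsci.edu/hst/search.php"

def getmast_url(target, instrument=None, max_records=100):
    """Build a query URL listing HST datasets for a target in CSV form."""
    params = {
        "target": target,
        "max_records": max_records,
        "outputformat": "CSV",
        "action": "Search",
    }
    if instrument:
        params["sci_instrume"] = instrument  # e.g. "ACS", "WFPC2"
    return MAST_SEARCH + "?" + urlencode(params)

# "getmast ngc1068 hst acs" would then reduce to:
url = getmast_url("NGC1068", instrument="ACS")
print(url)
```

The CSV response could be parsed with the standard `csv` module and each dataset name fed into a retrieval request, which is exactly the "submit retrieval request from script" workflow asked for above.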

11. The Hubble Legacy Archive (HLA) had its DR1 data release in February 2008. The HLA provides immediate access to enhanced Hubble products such as more extensive composite images (e.g., stacked, color, mosaics) with improved astrometry, NICMOS grism extractions, and source lists. It also has advanced browsing capabilities and a footprint service.

Please tell us how frequently you have made use of the HLA since its release.

Comments:

  • Reduced GHRS+STIS spectra would be extremely useful.

  • It does not provide the fully stacked images, only the 'visit' stacks.

  • Don't trust automatic stacks.

  • WFPC2 associations are badly needed.

  • It's not clear which data are in the HLA.

  • I should start using it. Grism reductions have been hard in the past.

12. If you have used the HLA, please tell us how useful you find it overall. Please enter specific comments, suggestions, or features you would like to see added to the HLA in the box below.

Comments:

  • It would be useful to know whether a particular object is actually covered by any dataset. For example, by marking the search coordinates in the Footprint view.

  • Overall good, but more filtering (on filters, length of exposure, etc.) should be possible. Also, there should be a way for people to easily access the reduced data via wget/scripts.

  • 1. It takes time to figure out which images I need; 2. many times object coordinates are outside the image; 3. sometimes image quality is poor (cosmic rays).

  • The user interface is awkward. Use the HST archive interface instead.

  • It's way too slow for practical usage. Just like Google Earth.

13. If you have used the HLA, please tell us how useful various features were to you:

Comments:

  • Inventory view is far too wide for browsers. Use HST archive format (with sort capability) instead.

  • The footprint view is quite slow and cumbersome to use. It'd be nice if some information were available about the images when you scrolled over them -- it's hard to connect the overlaid images with the data.

  • It would be useful to capture the footprint display as an image. Actually would be nice to have the aperture footprints as separate objects/layers, but I'd settle for an image.

14. MAST will host the archive for Kepler, a photometric search for extrasolar planets (including a GO program for variable objects), which will be launched in early 2009. Do you plan to make use of Kepler data?

If you answered yes, please tell us what tool(s) you would like to have available for the analysis of Kepler photometric light curves:

  • Object search tied to SIMBAD; quick-look light curve images to gauge blending of sources.

  • Flagged + classified variable stars.

  • A simple tool for recovering FITS images, extracted photometry or both would be very useful.

  • Online interactive fitting programme either with user-specified template or a set of templates.

  • Good PSF characterisation.

  • Ephemeris calculator - PDM/AoV - adjustable plotting intervals.

  • I might be interested in comparing these data to other photometric databases, like XMM-OM and Swift-UVOT. Any tools to make that easier, i.e., what are you going to offer for an ID of the source?

  • Calibrated time correlated photometric data at full temporal resolution.

  • I expect to have/develop my own tools. It would of course be useful in general to have basic tools to plot a time series, and also to generate a power spectrum and plot that over a user-specified frequency domain.

  • Time series analysis.

  • Identification of non-periodic variables, such as AGN or other transient sources, with easy cross-correlation with other surveys.

  • CasJobs

  • 1) Photometric light curves in a useful format (i.e., time, flux, error). 2) Proper flux values. 3) DVA and instrument signature removal, but no real signal removed. 4) Access to Kepler calibration light curves.

  • Search of light curves following statistical criteria (possibly written in form of script by the user).

  • How about a standard period-finder tool, like what the variable star people use to fit Cepheid and RR Lyrae light curves?

  • A tool to remove correlated noise and to identify planetary transits.

  • It would be nice to be able to see raw and processed data along side measurements already made.
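On the "standard period-finder tool" suggestion above: for unevenly sampled light curves the usual workhorse is the Lomb-Scargle periodogram. A minimal sketch using `scipy.signal.lombscargle` on a simulated light curve (not Kepler data, which did not yet exist at the time of this survey):

```python
# Lomb-Scargle period search on an unevenly sampled light curve.
# The light curve is simulated: a 2.5-day sinusoid plus noise.
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(42)

true_period = 2.5                       # days
t = np.sort(rng.uniform(0.0, 30.0, 400))  # uneven sampling over 30 days
flux = np.sin(2 * np.pi * t / true_period) + 0.1 * rng.normal(size=t.size)
flux -= flux.mean()                     # lombscargle expects zero-mean input

# Scan trial periods from 0.5 to 10 days (lombscargle takes angular freqs).
periods = np.linspace(0.5, 10.0, 5000)
omega = 2 * np.pi / periods
power = lombscargle(t, flux, omega)

best_period = periods[np.argmax(power)]
print(f"best period: {best_period:.3f} days")
```

The PDM and AoV statistics mentioned above would follow the same scan-over-trial-periods pattern with a different figure of merit, and an ephemeris calculator is a thin layer on top of the recovered period and a reference epoch.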

15. How can we do a better job of supporting your archive needs? Please suggest in the comment box other new features or improvements you would like to see.

  • I think to be a complete UV archive you should include XMM-OM and Swift-UVOT data as well. Especially for the Swift UVOT there is already a large data set available that covers a significant section of the sky.

  • Searching for planetary targets (Jupiter, Mars, Europa, etc.) is a royal pain, since SIMBAD and NED don't resolve these names. Searches for data by these target names are often grossly incomplete, since full target names are often entered with extraneous info (e.g., IO-ECLIPSE01, JUPITER-CML180, etc.). One must know a priori what planetary dataset they want to find with this current MAST set up. Surely there must be a better way around this problem.

  • I like the IUE and FUSE search by spectral types or objects. It would be nice if this feature were available when searching HST spectroscopic data. 1. The Jupiter HST data are difficult to get via MAST, especially observations made after 2002. What is the problem? 2. The preview of photos is very annoying: it should be possible to get that 'half compressed gif' mark away and have the picture shown as it is, without automatically changing the contrast. As it is, one has to redo every step again. Normally I only want to see what the photos show.

  • I found that to get a first impression of useful data, the legacy archive is very useful. It gives me a very quick idea of what data sets I actually need. Still, downloading them via MAST is much more convenient, as I can select multiple datasets. It would be great if both could be combined, such that one could select all wanted datasets in the legacy archive and then download them all at once. Also, in MAST I can only combine the download for one instrument: if, for example, I want to download ACS and WFPC2 data, I have to go through two download steps.

  • Keyword/object-name searches with wildcards, partial matches, etc.

  • I recently tried the HST preview for STIS and found it very helpful in displaying images where there are no spectra. I could sometimes see faint sources nicely; but when I got the actual data and displayed the same, cosmic ray hits obscured the spectrum. Thus, MAST must have done an excellent job of removing CR hits in the non-CR-split image. I would like a clue on the display of how the data were processed for display.

  • In the Database Info section, a better description of the entries in each data table would be helpful in determining what information to extract.

  • Searches of sources often only show the first 10 results from a mission, without an obvious link to get the rest of the results. It'd be nice to put a link to do that.

  • Nice Flash GalexView. Overview of other MAST data in with the GALEX data tiles.

  • PLEASE make the GALEX search tool return usable quicklooks.

  • Easy downloading of individual GALEX spectra.

  • More search options. Options like filter, grating, etc. specific to images and spectra.

  • Information on Galex Spectroscopy tool was available until recently and now all evidence of it even existing is gone, without so much as a warning to users.

  • 1. As I have been asking for YEARS!... For HST data (and in particular NICMOS), it is unbelievably cumbersome, and unreliable for completeness, not to be able to query for and retrieve data AT THE EXPOSURE LEVEL for ALL exposures! Seeing 'association' IDs as a result of MAST queries, with only 'singletons' showing up with IDs at the exposure level, continues to be a big bone of contention and makes work and life difficult. Will this EVER get in? When I do a query I want to see the individual exposure IDs, NOT an 'association name'. At LEAST make this an option. PLEASE! (For the record, I think this is 8 or 9 years I have been asking about this. I KNOW it is a problem for other NICMOS users as well). 2. For NICMOS calibration data, not being able to query for internal flats by calibration lamp status (not in the MAST accessible metadata) is a real pain. My collaborators and I are VERY grateful to the archive folks (and StarView gurus who are SQL savvy, as I am not) who have helped us over the years to 'find' the raw calibration data we need - but it would be great to be able to do that ourselves.

  • I would like to have access to the different databases used by the archive: access to the different tables within the databases and a list of the fields within these tables. I would like to be able to use any of these data fields as output columns of the query.

  • Shortlisting datasets based on more specific header information like SERIAL, GAIN, and SHUTTER (HST WFPC2) would be helpful as well.

  • When I search for HST imaging for a particular object, not all of the entries in the archive come up, even though I know they exist. I end up having to find the proposal ID another way and then search for that ID rather than the object. Furthermore, I don't understand the difference between the HST:WFPC2 and WFPC2_ASN entries. Why do some data come up under the ASN heading but are not listed in the individual WFPC2 exposures list?

  • Provide selectable views of a search -- I frequently do not need to see all the information of a search in the immediate results.

  • Improved search capabilities (search by name in HST abstracts, or by objects, e.g. 'galaxy clusters' would return the whole list of galaxy clusters, etc.) - even for data not yet in the HLA, an indication that it exists would be very useful.

  • Link to relevant documentation (e.g. instrument documents) from the HLA web pages.

  • This applies mainly to the HST archive, as HST has the largest number of instrument configurations. It would be great to have better control over the constraints used on the HST Search Form. As an example, I would like to find all STIS G140L observations taken with a specific aperture. This is currently possible, but I need to know the exact aperture names, etc. One could think of a dynamic web page, where I select the instrument first, and the page then reloads the appropriate drop-down menus for gratings and apertures. This may become quite complicated for some instruments, but would be useful...

  • I feel very strongly that science observations with the FGS instruments on HST should be available in reduced format through MAST. The instruments are quite unique (i.e. not a normal CCD camera) and therefore the reduction codes needed are unique. Installing the reduction 'package' (this term is a stretch) is very complicated and is possibly limited to use on Sun computers. The reduction process does not really involve any interactive decisions from the user. Thousands of HST orbits have been used for FGS observations. All of this together really points to the need for automatic delivery of reduced data products for the FGS. In particular, the FGS3 is not used for science observations anymore and this data could be 'closed out' in a similar manner as has been done for STIS. I have found the ability to get reduced STIS data incredibly useful in the last year, but have struggled to get FGS data without being an insider to the small group of people who have all the necessary tools.