Twenty-six codes were added to the ASCL in March:
ASTERIX: X-ray Data Processing System
BAOlab: Image processing program
CCDPACK: CCD Data Reduction Package
CHIMERA: Core-collapse supernovae simulation code
computePk: Power spectrum computation
disc2vel: Tangential and radial velocity components derivation
GAIA: Graphical Astronomy and Image Analysis Tool
GPU-D: Generating cosmological microlensing magnification maps
GRay: Massive parallel ODE integrator
Inverse Beta: Inverse cumulative density function (CDF) of a Beta distribution
ISAP: ISO Spectral Analysis Package
JAM: Jeans Anisotropic MGE modeling method
KAPPA: Kernel Applications Package
KINEMETRY: Analysis of 2D maps of kinematic moments of LOSVD
Lightcone: Light-cone generating script
MGE_FIT_SECTORS: Multi-Gaussian Expansion fits to galaxy images
MLZ: Machine Learning for photo-Z
pyExtinction: Atmospheric extinction
RMHB: Hierarchical Reverberation Mapping
SLALIB: A Positional Astronomy Library
SOFA: Standards of Fundamental Astronomy
SURF: Submm User Reduction Facility
T(dust) as a function of sSFR
Unified EOS for neutron stars
Viewpoints: Fast interactive linked plotting of large multivariate data sets
YNOGKM: Time-like geodesic calculations in Kerr-Newman spacetime
And seventeen codes were added to the ASCL in April:
AMBIG: Automated Ambiguity-Resolution Code
AST: World Coordinate Systems in Astronomy
CAP_LOESS_1D & CAP_LOESS_2D: Recover mean trends from noisy data
carma_pack: MCMC sampler for Bayesian inference
Comet: Multifunction VOEvent broker
LTS_LINEFIT & LTS_PLANEFIT: LTS fit of lines or planes
macula: Model of rotational modulations of a spotted star
RegPT: Regularized cosmological power spectrum
SAS: Science Analysis System for XMM-Newton observatory
SER: Subpixel Event Repositioning Algorithms
SpecPro: Astronomical spectra viewer and analyzer
Spextool: Spectral EXtraction tool
TORUS: Radiation transport and hydrodynamics code
TTVFast: Transit timing inversion
VictoriaReginaModels: Stellar evolutionary tracks
WFC3UV_GC: WFC3 UVIS geometric-distortion correction
ZDCF: Z-Transformed Discrete Correlation Function
Improvements are coming to the ASCL; we don't have a firm timeline yet but expect to have the majority of changes made well before the end of the year. The presentation below shows screenshots of the changes; we hope you like what you see.
The biggest change is that code entries will move out of the APOD discussion forum and into a new database. We have been running the new database in parallel with the existing ASCL and are getting closer to putting it into production. We are also integrating our current technologies -- this WordPress site for our general information and blog, and the phpBB forum for announcements of and discussion about individual codes -- into the new infrastructure.
Current URLs for code entries will continue to work after the new system is implemented. We will likely make the changes in several phases and will announce each phase, before and after, here and on our social media sites.
Please let us know what you think; thanks!
I'm delighted to offer the following guest post by Jonathan Petters, Data Management Consultant, Johns Hopkins Data Management Services, and thank him very much for it!
In a recent discussion on preservation and sharing of research data, a few participants expressed their concern (paraphrased here) that “My research community doesn't know how to create a quality data management plan” and “We don't know how to evaluate data management plans.” The astronomy community explicitly requested a little guidance. We in Johns Hopkins University Data Management Services have developed a few resources, described below, of use in both developing and evaluating data management plans within all research disciplines, including astronomy.
Funding agencies have long encouraged and expected that data and code used in the course of funded research be made available to others in the research discipline. The National Science Foundation (NSF), an important funder of astronomical research, has such expectations and is the agency I will focus on here. A few years ago NSF began requiring data management plans as part of research proposals, in part to aid in the dissemination and sharing of research data and code. Following a February 2013 Office of Science and Technology Policy memo, other US funding agencies, including the Department of Energy's Office of Science, are expected to follow suit with similar data management plan requirements.
What does NSF say about writing and evaluating quality data management plans? A good overview of NSF data policies relevant to the AST community can be found in these slides from Daniel Katz (NSF). In general, NSF states that data management will be defined by “the communities of interest.” The NSF AST-specific policy further states, “MPS Divisions will rely heavily on the merit review process in this initial phase to determine those types of plan that best serve each community and update the information accordingly.” Neither statement is especially prescriptive, and both can leave researchers unclear about what they should do.
Creating a plan
While effective research data management certainly has community- and discipline-specific attributes, there ARE aspects of effective data management that are generalizable across research disciplines. It is around these general aspects that we in Johns Hopkins University Data Management Services (JHUDMS) devised our Data Management Planning Questionnaire. We work through this questionnaire with researchers at Johns Hopkins to help them create effective data management plans.
The Questionnaire is designed to comprehensively hit upon the important aspects of effective research data management (e.g. data inputs/outputs in the research, ethical/legal compliance, standards and formats used, intended sharing and preservation, PI restrictions on the use of the data). By answering the applicable questions in the document, removing the questions/front matter and connecting the answers in each section into paragraphs, a researcher would be well on their way to a quality, well thought-out data management plan.
Two relevant side-notes:
1.) For the Questionnaire we consider code and software tools as one 'kind' of research data; thus analysis or simulation codes used in the course of your proposed research should be included as a Data Product. While research code and research data generated or processed by code are clearly NOT the same, there are many similarities in managing the two. In both cases effective management should include consideration of documentation, licensing, formats, associated metadata, and upon what platform(s) the data or code could be shared.
2.) In astronomy, as in other disciplines, a substantial amount of research is conducted through large collaborations (e.g., those built around HST or SDSS data). In these cases it is typical for investments to be made in research data infrastructure, and for data policies and practices to be defined for those working with the data. Citing those policies and practices in a data management plan would be appropriate.

Evaluating a plan
To help researchers evaluate data management plans for their quality, my colleagues developed the Reviewer Guide and Worksheet for Data Management Plans (dotx). This Guide and Worksheet is a complement to our Questionnaire; it is a handy checklist by which a grant reviewer can determine whether a researcher thoroughly considered the important aspects of research data management.
For those researchers saying to themselves, “The Questionnaire and Reviewer Guide are nice, but PLEASE just tell me what to do!!!”, I found two tweets from the code sharing session at the latest (223rd) AAS meeting in January to be quite relevant (h/t August Muench and Lucianne Walkowicz):
[Two embedded tweets from the AAS 223 code-sharing session]
I wholeheartedly agree with both tweets. It is up to the research community members to police and enforce the data management and sharing practices they would like to see in their community. That’s how peer review works! So the next time you review astronomical research proposals, look over the data management plans carefully and bring up relevant thoughts and concerns to the review panel.
Summing up
I hope the Data Management Planning Questionnaire and Reviewer Guide and Worksheet for Data Management Plans help you and other researchers in the astronomy community more fully develop expectations for data management and sharing practices. It’s likely your institution also has research data management personnel (like the JHUDMS at Hopkins) who are more than happy to help!
Mozilla Science Lab, GitHub and Figshare team up to fix the citation of code in academia
The Mozilla Science Lab, GitHub and Figshare – a repository where academics can upload, share and cite their research materials – are starting to tackle the problem. The trio have developed a system so researchers can easily sync their GitHub releases with a Figshare account. It creates a Digital Object Identifier (DOI) automatically, which can then be referenced and checked by other people.
Discussion of the above article on YCombinator
...it always makes me cringe when privately held companies want to define an "open standard" for scientific citations that (surprise!) relies completely on their proprietary infrastructure. I still remember the case of Mendeley, which promised to build an open repository for research documents, and which is now a subsidiary of Elsevier, an organization that does not really embrace "open science", to put it mildly.
Tool developed at CERN makes software citation easier
Researchers working at CERN have developed a tool that allows source code from the popular software development site GitHub to be preserved and cited through the CERN-hosted online repository Zenodo....
Now, people working on software in GitHub will be able to ensure that their code is not only preserved through Zenodo, but is also provided with a unique digital object identifier (DOI), just like an academic paper.
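One practical benefit of minting a DOI for a code release, whether through Figshare or Zenodo, is that the identifier can be turned into a formatted citation automatically. The short Python sketch below uses DOI content negotiation, a service supported by DataCite (which registers Zenodo's DOIs), to fetch a BibTeX entry for a software DOI; the DOI shown is a placeholder, not a real record.

```python
import requests

# Minimal sketch: turn a software DOI into a BibTeX entry via DOI content
# negotiation (supported by DataCite, which mints Zenodo's DOIs).
# The DOI below is a placeholder, not an actual software record.
doi = "10.5281/zenodo.0000000"

response = requests.get(
    f"https://doi.org/{doi}",
    headers={"Accept": "application/x-bibtex"},
    timeout=30,
)
response.raise_for_status()
print(response.text)  # BibTeX entry ready to drop into a paper's bibliography
```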
WebCite
WebCite is an on-demand archiving system for webreferences (cited webpages and websites, or other kinds of Internet-accessible digital objects), which can be used by authors, editors, and publishers of scholarly papers and books, to ensure that cited webmaterial will remain available to readers in the future.
DOIs unambiguously and persistently identify published, trustworthy, citable online scholarly literature. Right?
So DOIs unambiguously and persistently identify published, trustworthy, citable online scholarly literature. Right? Wrong.
The examples above are useful because they help elucidate some misconceptions about the DOI itself, the nature of the DOI registration agencies and, in particular, issues being raised by new RAs and new DOI allocation models.
Thirty-five codes were added to the ASCL in February:
Aladin Lite: Lightweight sky atlas for browsers
ANAigm: Analytic model for attenuation by the intergalactic medium
ARTIST: Adaptable Radiative Transfer Innovations for Submillimeter Telescopes
astroplotlib: Astronomical library of plots
athena: Tree code for second-order correlation functions
BAOlab: Baryon Acoustic Oscillations software
BF_dist: Busy Function fitting
CASSIS: Interactive spectrum analyzer
Commander 2: Bayesian CMB component separation and analysis
CPL: Common Pipeline Library
Darth Fader: Galaxy catalog cleaning method for redshift estimation
DexM: Semi-numerical simulations for very large scales
FAMA: Fast Automatic MOOG Analysis
GalSim: Modular galaxy image simulation toolkit
Glue: Linked data visualizations across multiple files
gyrfalcON: N-body code
HALOFIT: Nonlinear distribution of cosmological mass and galaxies
HydraLens: Gravitational lens model generator
KROME: Chemistry package for astrophysical simulations
libsharp: Library for spherical harmonic transforms
MGHalofit: Modified Gravity extension of Halofit
Munipack: General astronomical image processing software
P2SAD: Particle Phase Space Average Density
PyGFit: Python Galaxy Fitter
PyVO: Python access to the Virtual Observatory
PyWiFeS: Wide Field Spectrograph data reduction pipeline
QUICKCV: Cosmic variance calculator
QuickReduce: Data reduction pipeline for the WIYN One Degree Imager
SPLAT-VO: Spectral Analysis Tool for the Virtual Observatory
SPLAT: Spectral Analysis Tool
TARDIS: Temperature And Radiative Diffusion In Supernovae
UVMULTIFIT: Fitting astronomical radio interferometric data
Vissage: ALMA VO Desktop Viewer
wssa_utils: WSSA 12 micron dust map utilities
XNS: Axisymmetric equilibrium configuration of neutron stars
The ASCL now has 779 codes in it, some of which date back to the 1990s. With the speed at which both the web and code authors (often grad students or postdocs) move, links to some code sites are bound to go bad over time. We regularly run a link checker to make sure we aren't pointing to dead sites; when we do find a broken link (defined as one we haven't been able to reach for at least two weeks), we look for a new one and, if that fails, email the code author(s) to ask where the code has moved.
We can't always find a good link, and code authors sometimes don't reply to our emails. Currently, eight codes -- 1% of our entries -- have bad links. For half of these, we either cannot find the code author or the author has not replied to numerous emails.
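For the curious, the sketch below shows the general idea of such link checking in Python; it assumes a simple CSV of ASCL IDs, code URLs, and last-reachable dates, and it is only an illustration of the approach, not the ASCL's actual checking tool.

```python
import csv
import datetime

import requests

# Illustrative link checker, assuming a file "code_links.csv" with rows of
# (ascl_id, url, last_reachable_date); not the ASCL's actual tool.
BROKEN_AFTER = datetime.timedelta(weeks=2)


def reachable(url):
    """Return True if the code site responds with a non-error status."""
    try:
        r = requests.head(url, allow_redirects=True, timeout=30)
        if r.status_code >= 400:
            # Some servers reject HEAD requests; retry with GET before giving up.
            r = requests.get(url, allow_redirects=True, timeout=30)
        return r.status_code < 400
    except requests.RequestException:
        return False


today = datetime.date.today()
with open("code_links.csv", newline="") as f:
    for ascl_id, url, last_ok in csv.reader(f):
        if reachable(url):
            continue  # link is fine; nothing to do
        if today - datetime.date.fromisoformat(last_ok) >= BROKEN_AFTER:
            print(f"{ascl_id}: {url} unreachable for 2+ weeks -- follow up")
```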
What else can we do?
I assume that some code authors simply forget their codes. Having moved on, perhaps to another institution and other work, they have neither the time nor the incentive to create a new web home for a code they wrote some years ago. That's understandable, but then the code, a unique solution to a problem, an artifact of astrophysics research, a method used in research, is lost.
We'd like to save the codes (Save the Codes! I may have to put that on glow-in-the-dark pencils); here are a few ideas for authors who no longer want to maintain a site for their codes:
I don't know about option 4, but options 1-3 should take 15 minutes or less. Surely a code is worth that little bit of extra time to make it available to others even if you don't want to be bothered with it anymore.
Please save your code; don't let it go bad!
There are currently 768 codes registered in the ASCL; the percentages of codes hosted on different popular sites are:
GitHub: 4.17%
SourceForge: 3.78%
Code.Google: 1.96%
Bitbucket: 0.52%
That means 11% of the codes indexed by the ASCL are hosted on a public site conducive to social programming. That's higher than the 7% reported two years ago (almost exactly two years ago, by coincidence) and not unexpected given the growth of GitHub. Fewer than 1% of ASCL codes were on GitHub two years ago (only 3 at that time -- wow!); now 32 are hosted there. For comparison, there were 14 codes on SourceForge two years ago, so while that number has doubled, the growth in the use of GitHub is obviously much greater.
Though stored on sites conducive to collaboration, most of these codes are not the products of big collaborations; the majority of ASCL codes in these repositories have four or fewer authors.
I expect the percentage of codes on such sites to grow as more people use these tools for versioning; I think those who use such tools may also be more open to sharing their codes and advertising them (via links in papers if nothing else), making them easier to find/register in the ASCL, too.
"Each developer holds copyright in his or her code the moment it is written, and because all the world’s major copyright systems—including the US after 1976—do not require notices, publishing code without a copyright notice doesn’t change this."1
In the recent code sharing session at the AAS 223 meeting, both Alberto Accomazzi and David Hogg mentioned the difficulty of dealing with code that did not carry any license, copyright notice, or sometimes even author information. Such code is difficult to share for transparency, reuse, or expansion. Letting people know whether and how they can use and/or share your code is a kindness not just to them but to the community and even to yourself, whether you want to retain copyright on the code, choose one of the copyleft licenses, or release your code into the public domain.
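As a purely illustrative example of that kindness, even a short header at the top of a source file answers the basic questions of who holds the copyright and under what terms the code may be reused; the file name, author, year, and license choice below are placeholders, not recommendations.

```python
# orbit_fit.py -- least-squares fit of a Keplerian orbit to radial velocities
# (the file name, author, and year here are placeholders for illustration).
#
# Copyright (c) 2014 A. Astronomer <a.astronomer@example.edu>
#
# Released under the MIT License; see the LICENSE file distributed with the
# source, or https://opensource.org/licenses/MIT, for the full terms.
# An SPDX identifier is a compact, machine-readable way to say the same thing:
# SPDX-License-Identifier: MIT
```

Whatever license you choose, carrying the notice in the source itself (and in a LICENSE file in the distribution) means the terms travel with the code even when a single file is copied out of context.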
Just beginning to think about licensing and trying to wrap your head around it? TechSoup offers a good introduction to licensing in Making Sense of Software Licensing, and I've previously mentioned A Quick Guide to Software Licensing for the Scientist-Programmer from PLoS in our list of general articles that may be of interest to astronomical software users.
If you already know you want an open source license for your open source software (OSS) but don't know which to choose, the Choose a license site describes different popular open source licenses; it is a good resource for getting an overview of each of them. The Open Source Initiative also offers information on licenses and has a FAQ that is useful for clarifying such terms as copyleft, public domain, open source, and free software in addition to others one runs across when considering licensing.
Interested in retaining copyright within a collaborative free software project? This white paper from the Software Freedom Law Center identifies best practices for doing so. And if you're thinking about changing a code's license, you may want to read Bruce Berriman's informative post, with plenty of resources in it, on his Astronomy Computing Today blog.
What resources have you found helpful for licensing? I am very interested in knowing, and hope you will please share them; thank you!
1 http://softwarefreedom.org/resources/2012/ManagingCopyrightInformation.html
"...some of the greatest artifacts of the [astronomy] community’s creative problem-solving are at risk of being lost."
I believe this; a good thing, since this is what Peter Teuben and I wrote in We didn’t see this coming: Our unexpected roles as software archivists and what we learned at Preserving.exe, one of three participant reports in "Preserving.exe: Toward a National Strategy for Software Preservation."
This report arose from a summit held at the Library of Congress on May 20-21, 2013 by the National Digital Information Infrastructure and Preservation Program. Our piece discusses the summit itself, some of what we learned there, and its impact on the way we think about the ASCL and our work. Among the ideas raised at the summit was that of software as a cultural artifact. We wrote:
The Summit broadened our view and appreciation for software as a cultural artifact and as a method of capturing creativity in problem-solving.
Now we see the loss of the computational methods behind research results as a loss of part of astronomy’s cultural heritage. This isn't happening just to astronomy, of course; the Summit made clear that it is happening to everything. With so much rendered digitally, whether born that way or migrated to a digital medium, unless we preserve the digital artifacts and the software (and sometimes hardware) needed to lift those artifacts from their digital storage, we risk losing our art, our music, our games, our prose, our data, and our histories: of daily life and activities, of solutions to scientific problems, of popular pastimes and play experiences, and even knowledge of our computer worries and angst.
More on what we learned at the summit is available in the full report, which includes excellent pieces by participants Henry Lowood, Stanford University (The Lures of Software Preservation) and Matthew Kirschenbaum, University of Maryland (An Executable Past: The Case for a National Software Registry), an introduction by Trevor Owens, Library of Congress, and interviews of Doug White of the National Institute of Standards and Technology's National Software Reference Library and Michael Mansfield from the Smithsonian American Art Museum.
Preserving.exe: Toward a National Strategy for Software Preservation
It's not just astrophysics; other sciences are also grappling with issues surrounding software release, transparency of research, and collaboratively sharing codes.
The challenge of software licensing came up in the AAS 223 Special Session on code sharing; ASCL advisor Bruce Berriman followed up on this issue with a post on Astronomy Computing Today, and I've recently run across A Quick Guide to Software Licensing for the Scientist-Programmer, which also offers some guidance on this important issue.