Gamma-ray bursts (GRBs) are promising tools for tracing the formation of high-redshift stars, including the first generation. At very high redshifts the reverse shock emission lasts longer in the observer frame, so its importance relative to the forward shock increases for detection and analysis purposes. We consider two different models for the GRB environment, based on current ideas about the redshift dependence of gas properties in galaxies and about primordial star formation. We calculate the observed flux as a function of redshift and observer time for typical GRB afterglows, taking into account intergalactic photoionization and Lyα absorption opacity, as well as extinction by the Milky Way. The fluxes in the X-ray and near-IR bands are compared with the sensitivities of different detectors such as Chandra, XMM-Newton, Swift XRT, and the James Webb Space Telescope (JWST). Using standard assumptions, we find that Chandra, XMM-Newton, and Swift XRT can potentially detect GRBs in the X-ray band out to very high redshifts, z ≳ 30. In the K and M bands, JWST and ground-based telescopes are potentially able to detect GRBs even 1 day after the trigger out to z ∼ 16 and z ∼ 33, respectively, if GRBs occur at such redshifts. While the X-ray band is insensitive to the external density and to reverse shocks, the near-IR bands provide a sensitive tool for diagnosing both the environment and the reverse shock component.
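The redshift limits quoted for the K and M bands follow from the Lyα absorption mentioned above: intergalactic neutral hydrogen suppresses essentially all flux blueward of rest-frame Lyα (1215.67 Å), so a burst is only observable redward of the redshifted break. A minimal sketch (not from the paper; the band wavelengths used are standard approximate values) checks where that break falls at the quoted redshifts:

```python
# Observed wavelength of the Lyman-alpha break as a function of redshift.
# Blueward of this wavelength, intergalactic HI absorption removes
# essentially all flux from a high-z source (Gunn-Peterson trough).
LYA_REST_UM = 0.121567  # rest-frame Lyman-alpha wavelength in microns


def lya_break_um(z):
    """Observed Lyman-alpha break wavelength in microns at redshift z."""
    return LYA_REST_UM * (1.0 + z)


for z in (16, 33):
    print(f"z = {z}: Lyman-alpha break at {lya_break_um(z):.2f} micron")
```

At z ∼ 16 the break sits near 2.07 μm, just blueward of the K band (≈2.2 μm), and at z ∼ 33 near 4.13 μm, just blueward of the M band (≈4.8 μm), consistent with those bands remaining usable out to the quoted redshifts.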
All Science Journal Classification (ASJC) codes
- Astronomy and Astrophysics
- Space and Planetary Science