In a recent paper (Dai & Lu, ApJL, 519, L155), we propose a simple model in which the observed steepening in the light curve of the R-band afterglow of GRB 990123 is caused by an adiabatic shock that has evolved from an ultrarelativistic phase to a nonrelativistic phase in a dense medium. We find that such a model is quite consistent with the observations, including the HST detections, if the medium density is about . In another recent paper (astro-ph/9906109), we discuss this model in more detail; in particular, we investigate the effects of synchrotron self-absorption and energy injection. A shock in a dense medium becomes nonrelativistic rapidly, after only a short relativistic phase, and the afterglow from the shock decays more rapidly in the nonrelativistic stage than in the relativistic stage. Since some models for GRB energy sources predict that a strongly magnetized millisecond pulsar may be born during the formation of a GRB, we discuss the effect of such a pulsar on the evolution of the nonrelativistic shock through its magnetic dipole radiation. We find that once the energy the shock has gained from the pulsar greatly exceeds the initial energy of the shock, the afterglow decay flattens significantly; when the stellar effect disappears, the decay steepens again. These features are in excellent agreement with the afterglow of GRB 980519. Dense media in the vicinity of GRBs, i.e. ``dirty'' environments, appear naturally in models that invoke the collapse of massive stars.
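For reference, the energy-injection mechanism invoked above is the standard magnetic dipole spin-down of a newborn pulsar; the abstract does not spell out the formula, but a conventional sketch (symbols here are the usual ones, not taken from the abstract itself) is:

```latex
% Spin-down luminosity of a magnetic dipole rotator:
%   B_perp : dipole field component perpendicular to the spin axis
%   R      : stellar radius,  Omega_0 : initial angular frequency
%   I      : moment of inertia,  c : speed of light
L_{\rm em}(t) = \frac{B_\perp^2 R^6 \Omega_0^4}{6c^3}
                \left(1 + \frac{t}{T}\right)^{-2},
\qquad
T = \frac{3 c^3 I}{B_\perp^2 R^6 \Omega_0^2},
```

so the injected luminosity is roughly constant for $t \lesssim T$ and falls off as $t^{-2}$ afterwards, which is why the afterglow decay first flattens (while injection dominates) and then steepens again once the stellar effect disappears.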
Fifth Huntsville Gamma Ray Burst Symposium
Huntsville, Alabama, USA
18-22 October 1999