Listeners' accuracy in discriminating one temporal pattern from another was measured in three psychophysical experiments. When the standard pattern consisted of equally timed (isochronous) brief tones whose interonset intervals (IOIs) were 50, 100, or 200 msec, accuracy in detecting an asynchrony or delay of one tone in the sequence was about what would be predicted from earlier research on the discrimination of single time intervals (6%-8% at an IOI of 200 msec, 11%-12% at an IOI of 100 msec, and almost 20% at an IOI of 50 msec). In a series of 6 or 10 tones, this accuracy was independent of the position of the delay for IOIs of 100 and 200 msec. At 50 msec, however, accuracy depended on position, being worst in initial positions and best in final positions. When one tone in a series of six had a frequency different from the others, there was some evidence (at IOI = 200 msec) that interval discrimination was relatively poorer for the tone with the different frequency. Similarly, even when all tones had the same frequency but one interval in the series was made twice as long as the others, temporal discrimination was poorer for the tones bordering the longer interval, although this result depended on tempo or IOI. Results with these temporally more complex patterns may be interpreted in part by applying the relative Weber ratio to the intervals before and after the delayed tone. Alternatively, these experiments may show the influence of accent on the temporal discrimination of individual tones.
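
The reported thresholds are relative Weber fractions (delay expressed as a proportion of the IOI). A minimal sketch of the implied arithmetic is below; the specific fraction values are illustrative midpoints of the ranges quoted in the abstract, not the authors' exact data, and the function name is hypothetical.

```python
# Hedged sketch: converting the reported relative thresholds (Weber
# fractions) into absolute asynchrony thresholds in milliseconds.
# Fractions are assumed midpoints of the ranges in the abstract:
#   ~20% at 50 msec, ~11.5% at 100 msec, ~7% at 200 msec.
weber_fractions = {50: 0.20, 100: 0.115, 200: 0.07}  # IOI (msec) -> fraction

def absolute_threshold_ms(ioi_ms, weber_fraction):
    """Smallest detectable delay of one tone, in milliseconds."""
    return ioi_ms * weber_fraction

for ioi, w in sorted(weber_fractions.items()):
    print(f"IOI {ioi:>3} msec: fraction ~{w * 100:.1f}% "
          f"-> ~{absolute_threshold_ms(ioi, w):.1f} msec")
```

Note that although the relative (Weber) threshold worsens at the fastest tempo, the corresponding absolute delay in milliseconds stays within a fairly narrow band (roughly 10-14 msec across the three IOIs), which is one way to read the tempo dependence described above.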