GENEALOGY-DNA-L Archives

From: "Peter A. Kincaid" <>
Subject: Re: [DNA] Years per generation
Date: Thu, 03 Jan 2002 19:40:16 -0400
References: <a04310100b857d8b0e571@[194.125.131.79]>
In-Reply-To: <005c01c19388$552e0840$32284ed5@zog>


>The 20yr figure is used by academic DNA researchers who are not confining
>their interests to the recent past (hundreds of years) but right back to
>pre-history (tens of thousands of years), over which most of the generations
>will have been in more 'primitive' societies where the breeding interval
>probably started a little earlier and certainly ended a lot earlier than
>in post-AD1000 societies. In a primitive agricultural or
>hunter-gatherer society a breeding period of 15-ish to 25-ish is quite
>reasonable. To continue breeding into the mid-to-late thirties, as is common
>in western society post AD1000, would be very rare in a primitive
>population.


On what do they base these assumptions, except extrapolating modern
data back to primitive times? Genetically speaking, primitive men and
women reached puberty at about the same age we do, and women reached
the end of their fertile years at about the same age. The only birth
control they had was death or infertility, so why would they not breed
into their late 30s? The infant mortality rate was certainly higher,
but the human body still had the same capacity to reach the same age.
Logic certainly suggests that most modern DNA is derived from those
who did reach an older age, because they would have been able to
outbreed the non-survivors. So how did they get the 20 year number?

For example, say the primitive population was 10,000 women, with 80%
reproducing from 14 to 30 and then dying, 15% reproducing until 35,
and 5% reproducing until menopause (say 44). The latter 20%, while
smaller, would each have from 3 to 7 more children (assuming a child
born every 2 years). Over time this numerical advantage would be
reflected as a majority in modern DNA. It would also have a bearing on
the years per generation, making a 20 year generation not applicable
for part of primitive time.
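
To see how this plays out, here is a rough back-of-the-envelope sketch
in Python. The cohort fractions, the first-birth age of 14, and the
2-year birth spacing are just the hypothetical numbers from the example
above, not data; the "generation interval" computed is simply the mean
age of mothers at the birth of their children.

    # Hypothetical cohorts from the example above: (fraction of women, age at last birth).
    cohorts = [(0.80, 30), (0.15, 35), (0.05, 44)]
    FIRST_BIRTH = 14   # assumed age at first birth
    SPACING = 2        # assumed years between births

    total_children = 0.0
    total_mother_age = 0.0
    for fraction, last_birth in cohorts:
        birth_ages = range(FIRST_BIRTH, last_birth + 1, SPACING)
        total_children += fraction * len(birth_ages)
        total_mother_age += fraction * sum(birth_ages)

    # Mean age of mothers at childbirth, i.e. the effective generation interval.
    print("children per woman: %.2f" % total_children)
    print("mean generation interval: %.1f years" % (total_mother_age / total_children))

Under these made-up numbers the longer-breeding 20% of women account
for roughly a quarter of all children born, and the mean mother's age
at childbirth comes out near 23 years rather than 20, which is the
direction of the argument above.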

Secondly, while you raise a point about the modern generation rate
being different from that over thousands of years, why do they use the
lower figure for modern times? If it is wrong for genealogists to use
a 35 year generation and project it back to primitive times, it is
also wrong for geneticists to project a lower rate forward to modern
times. If there are two rates, then we should use whichever applies to
the period in question. I am interested in genealogy (i.e. modern
times), so I care little about the primitive mutation rates. Thus, the
labs who are dealing with "genealogy" clients should reflect the
modern rate (i.e. 30+ years) in their analysis rather than a primitive
rate.
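
As a minimal illustration of why the choice matters on genealogical
timescales (the generation count here is purely a hypothetical figure):

    # Converting a hypothetical count of generations back to a common ancestor
    # into calendar years under different years-per-generation assumptions.
    generations_back = 20   # hypothetical figure, e.g. from a marker-based estimate
    for years_per_generation in (20, 30, 35):
        years = generations_back * years_per_generation
        print("%d yr/generation -> about %d years back" % (years_per_generation, years))

Twenty generations lands around AD 1600 at 20 years per generation but
around AD 1400 at 30 years per generation, a difference that matters
to a genealogist even if it washes out over tens of thousands of years.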

The real answer to this whole issue is probably that in primitive
times the figure was lower, say 20 years, but has crept up to say 35
years in modern times. If you are looking at 140,000 years ago one
could use say 20; 100,000 years ago, say 24; 60,000 years ago, say 28;
20,000 years ago, say 32. On average over the whole period you should
not use 20 but a better average such as 26.
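
A small sketch of that averaging, assuming (my assumption, not an
established fact) that the interval rose roughly linearly between the
anchor points above and reaches about 35 today, and weighting each
stretch of time by its length:

    # Anchor points (years ago, assumed years per generation); the 35 at
    # present day is taken from the "crept up to say 35" guess above.
    anchors = [(140_000, 20), (100_000, 24), (60_000, 28), (20_000, 32), (0, 35)]

    total_span = 0.0
    weighted_sum = 0.0
    for (t0, g0), (t1, g1) in zip(anchors, anchors[1:]):
        span = t0 - t1                           # length of this stretch of time
        weighted_sum += span * (g0 + g1) / 2.0   # mean of a linearly changing interval
        total_span += span
    print("time-weighted average: about %.1f years per generation" % (weighted_sum / total_span))

That comes out around 27 years per generation, in the same
neighbourhood as the 26 suggested above and well away from a flat 20.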

My non-academic food for thought!


Peter A. Kincaid
Hampton, NB, Canada

