GENEALOGY-DNA-L Archives > GENEALOGY-DNA > 2012-02 > 1329388644
From: "Anatole Klyosov" <>
Subject: Re: [DNA] Out of Africa
Date: Thu, 16 Feb 2012 05:37:24 -0500
> From: "Lawrence Mayka" <>
> There is another source of inaccuracy that particularly affects TMRCA
> calculations from the hunter-gatherer era: the timespan of a generation.
> The usual wisdom is that since the Neolithic, the average time separation
> between two patrilineal generations is roughly 30 years.
I have explained here many times a very simple and basic thing: in DNA
genealogy you operate not with generations per se, but with the product "kt",
that is, the product of the generation length and the mutation rate constant.
That is why 30 years or 25 years or 100 years per generation is irrelevant:
you experimentally determine "kt". Once you have determined it, you PICK the
generation length, and you get the mutation rate constant. If you want
another generation length, no problem, pick it and adjust the mutation rate
constant accordingly. For example, the mutation rate constant of 0.12
mutations/haplotype/generation for 67-marker haplotypes is valid ONLY for
25 years per (conditional) generation. For the 22-marker ("slow") haplotypes
the mutation rate constant is 0.06 mutations/haplotype/generation. You can
read about it here: http://www.scirp.org/journal/aa/
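The invariance being described can be sketched numerically. This is only an illustration, assuming the simple linear ("counting") TMRCA estimate with no back-mutation correction; the average-mutations figure is hypothetical, while the 0.12 mutations/haplotype/generation at 25 years per generation is the 67-marker calibration quoted above:

```python
# Show that the TMRCA in years does not depend on the chosen generation
# length, as long as the mutation rate constant is rescaled with it
# (the experimentally calibrated quantity couples the two).

def tmrca_years(avg_mutations, rate_per_generation, years_per_generation):
    """Linear (counting) estimate: generations = m / k, then convert
    to years. Back-mutation corrections are ignored for illustration."""
    generations = avg_mutations / rate_per_generation
    return generations * years_per_generation

m = 0.60  # hypothetical average mutations per 67-marker haplotype

# Calibration from the post: k = 0.12 mut/haplotype/generation at 25 yr/gen
t25 = tmrca_years(m, 0.12, 25)

# Re-pick a 30-year generation: rescale k in proportion, so that the
# number of mutations per YEAR stays the same
k30 = 0.12 * 30 / 25          # = 0.144 mut/haplotype/generation
t30 = tmrca_years(m, k30, 30)

print(t25, t30)  # both 125.0 years
```

Whatever generation length is picked, the rescaling cancels it out, which is the point of the paragraph above: the generation-length debate has no effect on the final date in years.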
> This issue affects SNP-based dating as well, if the SNP mutation rate is
> itself calculated on a per-generation basis.
Yes, this is a problem with SNP-based dating, because they (e.g.,
Cruciani) make two independent assumptions: one on the mutation rate constant
(huge margins of error) and one on the generation length (huge margins of
error as well, anywhere between 16 and 35 years; in short, we do not know).
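The size of the second error source alone is easy to see. A minimal sketch, using a hypothetical TMRCA of 300 generations and holding the per-generation SNP rate fixed (so only the 16–35-year generation-length range from the post varies; rate uncertainty would multiply on top of this):

```python
# Spread in an SNP-based date caused by generation-length uncertainty
# alone, with the per-generation mutation rate held fixed.

generations_to_mrca = 300            # hypothetical: SNP count / per-gen rate

dates = {g: generations_to_mrca * g for g in (16, 25, 35)}
for gen_len, years in dates.items():
    print(f"{gen_len} yr/gen -> {years} years")

spread = dates[35] / dates[16]
print(f"spread factor: {spread:.2f}")  # about 2.19x
```

So even a perfectly known per-generation SNP rate leaves more than a factor-of-two spread in the resulting dates, which is why the two assumptions have to be calibrated jointly rather than picked independently.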