<p>When we don't care about order, the usual way to remove duplicates is to <a href="https://stackoverflow.com/questions/7961363/removing-duplicates-in-lists">put them in a set</a>.</p>
<p>However, we can't put <code>dict</code>s into a set as-is, because they <a href="https://stackoverflow.com/questions/13264511/typeerror-unhashable-type-dict">aren't hashable</a>. Instead, we can store each dict's data in a hashable form (which lets the set remove duplicates naturally), and then rebuild the original dicts.</p>
<pre><code>def remove_duplicates(dicts):
    # A set made from the hashable equivalent of each dict.
    unique = {frozenset(d.items()) for d in dicts}
    # Now we go backwards, building a list from the dict equivalents.
    return [dict(hashable) for hashable in unique]
</code></pre>
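<p>A quick usage sketch (the sample list is illustrative). Note that this approach assumes every dict's values are themselves hashable; a dict containing, say, a list value would still raise <code>TypeError</code>. Also, since sets are unordered, the result's order is not guaranteed:</p>

```python
def remove_duplicates(dicts):
    # A set made from the hashable equivalent of each dict.
    unique = {frozenset(d.items()) for d in dicts}
    # Rebuild plain dicts from the frozenset equivalents.
    return [dict(hashable) for hashable in unique]

# Hypothetical sample data: one exact duplicate.
records = [{"a": 1}, {"b": 2}, {"a": 1}]
deduped = remove_duplicates(records)
print(len(deduped))  # 2
```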