How do I read degrees-minutes-seconds (DMS) data directly from a .CSV file into a pandas DataFrame as strings?

Posted 2024-09-26 18:15:50


How do I use pandas to read degrees-minutes-seconds (DMS) data directly from a .CSV file into a DataFrame as strings? For example: if I have a CSV file containing a column of data in DMS format, how can I read it into a DataFrame as strings for further calculation?

76° 17' 51.2399" E
77° 26' 30.8322" E
76° 51' 29.7812" E
75° 45' 41.3540" E
76° 17' 51.2399" E

Sample input file: enter link description here

When I use pandas.read_csv('test.csv')  # test.csv is the input file, I get the following error:

Traceback (most recent call last):

  File "<ipython-input-90-2af7440e7795>", line 1, in <module>
    df = pd.read_csv('test.csv')

  File "C:\ProgramData\Anaconda3\envs\obspy\lib\site-packages\pandas\io\parsers.py", line 676, in parser_f
    return _read(filepath_or_buffer, kwds)

  File "C:\ProgramData\Anaconda3\envs\obspy\lib\site-packages\pandas\io\parsers.py", line 448, in _read
    parser = TextFileReader(fp_or_buf, **kwds)

  File "C:\ProgramData\Anaconda3\envs\obspy\lib\site-packages\pandas\io\parsers.py", line 880, in __init__
    self._make_engine(self.engine)

  File "C:\ProgramData\Anaconda3\envs\obspy\lib\site-packages\pandas\io\parsers.py", line 1114, in _make_engine
    self._engine = CParserWrapper(self.f, **self.options)

  File "C:\ProgramData\Anaconda3\envs\obspy\lib\site-packages\pandas\io\parsers.py", line 1891, in __init__
    self._reader = parsers.TextReader(src, **kwds)

  File "pandas\_libs\parsers.pyx", line 529, in pandas._libs.parsers.TextReader.__cinit__

  File "pandas\_libs\parsers.pyx", line 749, in pandas._libs.parsers.TextReader._get_header

UnicodeDecodeError: 'utf-8' codec can't decode byte 0xf8 in position 2: invalid start byte

2 answers

Try a different encoding:

import pandas as pd
# The 0xf8 byte is not valid UTF-8; reading with a single-byte
# Windows encoding avoids the decode error
df = pd.read_csv('test.csv', encoding='windows-1252')
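
If you are not sure which encoding to pass, one option is to let a detection library guess it first. This is a minimal sketch, not part of the original answer; it assumes the third-party chardet package is installed (pip install chardet):

import chardet

# Read the raw bytes and let chardet guess the file's encoding
with open('test.csv', 'rb') as f:
    guess = chardet.detect(f.read())

print(guess)  # e.g. {'encoding': 'Windows-1252', 'confidence': 0.73, 'language': ''}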

The CSV file:

id,string
1,76° 17' 51.2399" E
2,77° 26' 30.8322" E
3,76° 51' 29.7812" E
4,75° 45' 41.3540" E
5,76° 17' 51.2399" E

Code:

df = pd.read_csv('test.csv', encoding='windows-1252')
print(df)

   id              string
0   1  76° 17' 51.2399" E
1   2  77° 26' 30.8322" E
2   3  76° 51' 29.7812" E
3   4  75° 45' 41.3540" E
4   5  76° 17' 51.2399" E
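
Once the column is read in as strings, the "further calculation" the question mentions typically starts by converting DMS to decimal degrees. The following is only a sketch, not part of the original answer: the dms_to_decimal helper, the regex, and the sign convention (S/W negative) are my assumptions; the column name "string" matches the sample CSV above:

import re

import pandas as pd

def dms_to_decimal(dms: str) -> float:
    """Convert a DMS string such as 76° 17' 51.2399" E to decimal degrees."""
    match = re.match(r"""(\d+)°\s*(\d+)'\s*([\d.]+)"\s*([NSEW])""", dms.strip())
    if match is None:
        raise ValueError(f"not a DMS string: {dms!r}")
    degrees, minutes, seconds, hemisphere = match.groups()
    value = int(degrees) + int(minutes) / 60 + float(seconds) / 3600
    # Assumed convention: southern and western hemispheres are negative
    return -value if hemisphere in 'SW' else value

df = pd.read_csv('test.csv', encoding='windows-1252')
df['decimal'] = df['string'].apply(dms_to_decimal)
print(df)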

Could you share a sample of your CSV file?
