How do I interpolate data in a CSV file at 15-minute intervals using Python pandas?

Problem description

The data I am currently working with has unequal time intervals between data points. I am trying to use interpolation in Python to convert the data to 15-minute intervals. Here are the first few rows of the csv file: CSV file

I am trying to do this using the datetime column. I only want to keep the datetime column and the columns listed on line 5 of the code. This is the code I have so far; I try to resample and then interpolate the data:

import pandas as pd

df = pd.read_csv('C:/Users/elsam/Documents/Year 3/Final EN3300 Project/Machine Learning/Data/locations/asp/row/noaa/5 min NOAA_0_.csv')
df['datetime'] = pd.to_datetime(df['datetime'])
df.index = df['datetime']
df = df.resample(rule='15min',on='datetime').mean()
df['wd','ws','vis','drb','dpt','asp','rh','cc','precip','sNow'] = df['wd','sNow'].interpolate()

However, I get this error message:

Traceback (most recent call last):
  File "C:\Users\elsam\Documents\Year 3\Final EN3300 Project\Machine Learning\Code\venv\lib\site-packages\pandas\core\indexes\base.py", line 3080, in get_loc
    return self._engine.get_loc(casted_key)
  File "pandas\_libs\index.pyx", line 70, in pandas._libs.index.IndexEngine.get_loc
  File "pandas\_libs\index.pyx", line 101, in pandas._libs.index.IndexEngine.get_loc
  File "pandas\_libs\hashtable_class_helper.pxi", line 4554, in pandas._libs.hashtable.PyObjectHashTable.get_item
  File "pandas\_libs\hashtable_class_helper.pxi", line 4562, in pandas._libs.hashtable.PyObjectHashTable.get_item
KeyError: ('wd', 'sNow')

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\elsam\Documents\Year 3\Final EN3300 Project\Machine Learning\Code\Data Compilation.py", line 352, in <module>
    df['wd','sNow'].interpolate()
  File "C:\Users\elsam\Documents\Year 3\Final EN3300 Project\Machine Learning\Code\venv\lib\site-packages\pandas\core\frame.py", line 3024, in __getitem__
    indexer = self.columns.get_loc(key)
  File "C:\Users\elsam\Documents\Year 3\Final EN3300 Project\Machine Learning\Code\venv\lib\site-packages\pandas\core\indexes\base.py", line 3082, in get_loc
    raise KeyError(key) from err
KeyError: ('wd', 'sNow')

I don't understand why I am getting a "KeyError", since the columns I listed in the code are definitely present in the CSV file.
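The traceback itself hints at the likely cause: `df['wd','sNow']` asks pandas for a single column whose *name* is the tuple `('wd', 'sNow')`, not for two columns, hence the `KeyError`. Selecting multiple columns needs a list inside the brackets, i.e. double brackets `df[['wd','sNow']]`, and the same applies on the assignment side. A minimal sketch with made-up values (the real CSV is not available here, so the frame below is hypothetical):

```python
import pandas as pd
import numpy as np

# Small synthetic stand-in for the CSV: irregular timestamps, some gaps.
idx = pd.to_datetime(['2021-01-01 00:00', '2021-01-01 00:05',
                      '2021-01-01 00:40'])
df = pd.DataFrame({'wd': [180.0, np.nan, 200.0],
                   'sNow': [0.0, np.nan, 1.0]}, index=idx)

# df['wd', 'sNow'] looks up the single tuple key ('wd', 'sNow') -> KeyError,
# which is exactly the error in the question.
try:
    df['wd', 'sNow']
except KeyError as e:
    print('KeyError:', e)

# Resample to a regular 15-minute grid, then interpolate the selected
# columns, using a *list* of column names (double brackets).
out = df.resample('15min').mean()
cols = ['wd', 'sNow']
out[cols] = out[cols].interpolate()
print(out)
```

With this data, the empty 00:15 bin is filled by linear interpolation between the 00:00 and 00:30 bins (e.g. `wd` becomes 190.0). Note also that in the original code, passing `on='datetime'` to `resample` after already setting `datetime` as the index is redundant; one or the other is enough.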

Workaround

No working solution for this problem has been found yet.
