Commit

Merge branch 'master' into development
mdeceglie committed Jul 31, 2023
2 parents adf9f44 + 8013982 commit 1b29360
Showing 30 changed files with 332 additions and 295 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/nbval.yaml
@@ -25,7 +25,7 @@ jobs:
- name: Install notebook environment
run: |
python -m pip install --upgrade pip wheel
pip install -r requirements.txt -r docs/notebook_requirements.txt .[test]
pip install --timeout=300 -r requirements.txt -r docs/notebook_requirements.txt .[test]
- name: Run notebook and check output
run: |
# --sanitize-with: pre-process text to remove irrelevant differences (e.g. warning filepaths)
4 changes: 2 additions & 2 deletions .github/workflows/pytest.yaml
@@ -40,7 +40,7 @@ jobs:
- name: Install ${{ matrix.env }}
run: |
python -m pip install --upgrade pip wheel
pip install ${{ matrix.env }}
pip install --timeout=300 ${{ matrix.env }}
- name: Test with pytest ${{ matrix.env }}
run: |
pytest
pytest
2 changes: 1 addition & 1 deletion .github/workflows/requirements.yaml
@@ -26,4 +26,4 @@ jobs:
- name: Install notebook environment
run: |
python -m pip install --upgrade pip wheel
pip install -r requirements.txt -r docs/notebook_requirements.txt
pip install --timeout=300 -r requirements.txt -r docs/notebook_requirements.txt
58 changes: 29 additions & 29 deletions docs/TrendAnalysis_example_pvdaq4.ipynb

Large diffs are not rendered by default.

52 changes: 26 additions & 26 deletions docs/degradation_and_soiling_example.ipynb

Large diffs are not rendered by default.

272 changes: 121 additions & 151 deletions docs/degradation_and_soiling_example_pvdaq_4.ipynb

Large diffs are not rendered by default.

Binary file not shown.
(Three changed files could not be displayed.)
Binary file not shown.
(Three changed files could not be displayed.)
8 changes: 4 additions & 4 deletions docs/notebook_requirements.txt
@@ -11,10 +11,10 @@ defusedxml==0.7.1
entrypoints==0.2.3
html5lib==1.0.1
ipykernel==4.8.2
ipython==7.16.3
ipython==8.10.0
ipython-genutils==0.2.0
ipywidgets==7.3.0
jedi==0.12.1
jedi==0.16.0
Jinja2==3.0.0
jsonschema==2.6.0
jupyter==1.0.0
@@ -32,11 +32,11 @@ nest-asyncio==1.5.5
notebook==6.4.12
numexpr==2.8.0
pandocfilters==1.4.2
parso==0.3.1
parso==0.5.2
pexpect==4.6.0
pickleshare==0.7.5
prometheus-client==0.3.0
prompt-toolkit==3.0.27
prompt-toolkit==3.0.30
ptyprocess==0.6.0
pycparser==2.20
Pygments==2.7.4
3 changes: 3 additions & 0 deletions docs/sphinx/source/changelog.rst
@@ -1,6 +1,9 @@
RdTools Change Log
==================

.. include:: changelog/v2.2.0-beta.1.rst
.. include:: changelog/v2.1.6.rst
.. include:: changelog/v2.1.5.rst
.. include:: changelog/v2.1.4.rst
.. include:: changelog/v2.2.0-beta.0.rst
.. include:: changelog/v2.1.3.rst
13 changes: 13 additions & 0 deletions docs/sphinx/source/changelog/v2.1.5.rst
@@ -0,0 +1,13 @@
*************************
v2.1.5 (May 16, 2023)
*************************

Bug Fixes
---------
* Add support for pandas 2.0 (:issue:`361`, :pull:`362`)


Contributors
------------
* Kevin Anderson (:ghuser:`kanderso-nrel`)
* Michael Deceglie (:ghuser:`mdeceglie`)
37 changes: 37 additions & 0 deletions docs/sphinx/source/changelog/v2.1.6.rst
@@ -0,0 +1,37 @@
**********************
v2.1.6 (July 31, 2023)
**********************

Bug Fixes
---------
* Fix NonExistentTimeError in :py:func:`rdtools.clearsky_temperature.get_clearsky_tamb`
(:issue:`372` :pull:`373`)

Enhancements
------------
* :py:func:`rdtools.degradation.degradation_classical_decomposition` now
executes significantly faster. (:pull:`371`)

Requirements
------------
* Increased the minimum versions of several dependencies: (:pull:`371`)

+ pandas increased to 1.3.0 (released July 2, 2021)
+ numpy to 1.17.3 (released October 17, 2019)
+ statsmodels to 0.11.0 (released February 21, 2020)
+ scipy to 1.2.0 (released December 17, 2018)

* Add support for pvlib 0.10 (:pull:`378`)
* Updated notebook requirements (:pull:`360`)
* Bumps certifi from 2020.12.5 to 2022.12.7 (:pull:`357`)

Testing
-------
* Extended pip timeout (:pull:`360`)
* Updated example notebooks with new figure sizes (:pull:`360`)

Contributors
------------
* Michael Deceglie (:ghuser:`mdeceglie`)
* Bernat Nicolau (:ghuser:`BernatNicolau`)
* Kevin Anderson (:ghuser:`kandersolar`)
33 changes: 18 additions & 15 deletions docs/system_availability_example.ipynb

Large diffs are not rendered by default.

4 changes: 3 additions & 1 deletion rdtools/clearsky_temperature.py
@@ -57,7 +57,7 @@ def get_clearsky_tamb(times, latitude, longitude, window_size=40,
freq_actual = times.freq

dt_daily = pd.date_range(times.date[0] - buffer, times.date[-1] + buffer,
freq='D', tz=times.tz)
freq='D')
dt_daily = dt_daily.tz_localize(times.tz, ambiguous='infer',
nonexistent='shift_forward')

f = h5py.File(filepath, "r")

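Note on the change above: building the daily index tz-naive and localizing it afterwards sidesteps pandas' NonExistentTimeError in timezones whose DST transition falls at midnight. A minimal sketch (illustrative only, not the rdtools code itself; it reuses the America/Santiago dates from the new test added later in this commit):

    import pandas as pd

    tz = 'America/Santiago'  # DST begins at midnight here (see issue 372)

    # Localizing while constructing the range can raise, because midnight on
    # the transition day does not exist in this timezone:
    try:
        pd.date_range('2018-08-10', '2018-08-14', freq='D', tz=tz)
    except Exception as err:
        print(type(err).__name__)  # NonExistentTimeError

    # Building the range naive and localizing afterwards lets pandas shift the
    # missing midnight forward instead of raising:
    daily = pd.date_range('2018-08-10', '2018-08-14', freq='D')
    daily = daily.tz_localize(tz, ambiguous='infer', nonexistent='shift_forward')
    print(daily)
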
26 changes: 8 additions & 18 deletions rdtools/degradation.py
@@ -38,7 +38,7 @@ def degradation_ols(energy_normalized, confidence_level=68.2):

# calculate a years column as x value for regression, ignoring leap years
day_diffs = (df.index - df.index[0])
df['days'] = day_diffs.astype('timedelta64[s]') / (60 * 60 * 24)
df['days'] = day_diffs / pd.Timedelta('1d')
df['years'] = df.days / 365.0

# add intercept-constant to the exogeneous variable
@@ -123,22 +123,14 @@ def degradation_classical_decomposition(energy_normalized,

# calculate a years column as x value for regression, ignoring leap years
day_diffs = (df.index - df.index[0])
df['days'] = day_diffs.astype('timedelta64[s]') / (60 * 60 * 24)
df['days'] = day_diffs / pd.Timedelta('1d')
df['years'] = df.days / 365.0

# Compute yearly rolling mean to isolate trend component using
# moving average
it = df.iterrows()
energy_ma = []
for i, row in it:
if row.years - 0.5 >= min(df.years) and \
row.years + 0.5 <= max(df.years):
roll = df[(df.years <= row.years + 0.5) &
(df.years >= row.years - 0.5)]
energy_ma.append(roll.energy_normalized.mean())
else:
energy_ma.append(np.nan)

energy_ma = df['energy_normalized'].rolling('365d', center=True).mean()
has_full_year = (df['years'] >= df['years'][0] + 0.5) & (df['years'] <= df['years'][-1] - 0.5)
energy_ma[~has_full_year] = np.nan
df['energy_ma'] = energy_ma

# add intercept-constant to the exogeneous variable
@@ -290,7 +282,7 @@ def degradation_year_on_year(energy_normalized, recenter=True,
tolerance=pd.Timedelta('8D')
)

df['time_diff_years'] = (df.dt - df.dt_right).astype('timedelta64[h]') / 8760.0
df['time_diff_years'] = (df.dt - df.dt_right) / pd.Timedelta('365d')
df['yoy'] = 100.0 * (df.energy - df.energy_right) / (df.time_diff_years)
df.index = df.dt

@@ -395,10 +387,8 @@ def _mk_test(x, alpha=0.05):
n = len(x)

# calculate S
s = 0
for k in range(n - 1):
for j in range(k + 1, n):
s += np.sign(x[j] - x[k])
x = np.array(x)
s = np.sum(np.triu(np.sign(-np.subtract.outer(x, x)), 1))

# calculate the unique data
unique_x = np.unique(x)
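
Note on the rdtools/degradation.py changes above: the timedelta casts are replaced by division with pd.Timedelta, the row-by-row centered moving average becomes a time-based rolling window, and the Mann-Kendall S statistic is vectorized with np.triu. A short self-contained sketch (toy data; names are illustrative, not rdtools code) that exercises the same idioms and checks the vectorized S against the old pairwise loop; it assumes pandas >= 1.3, the new minimum pinned in the v2.1.6 changelog:

    import numpy as np
    import pandas as pd

    idx = pd.date_range('2019-01-01', periods=3 * 365, freq='D')
    rng = np.random.default_rng(0)
    energy = pd.Series(1 - 1e-4 * np.arange(len(idx)) + 0.01 * rng.random(len(idx)),
                       index=idx)

    # Days elapsed without the removed .astype('timedelta64[s]') cast:
    days = (idx - idx[0]) / pd.Timedelta('1d')

    # Centered yearly moving average in one call instead of iterrows():
    energy_ma = energy.rolling('365d', center=True).mean()

    # Mann-Kendall S: sum of sign(x[j] - x[k]) over all pairs with j > k.
    x = energy.to_numpy()[:200]  # small slice so the reference loop stays quick
    s_loop = sum(np.sign(x[j] - x[k])
                 for k in range(len(x) - 1) for j in range(k + 1, len(x)))
    s_vec = np.sum(np.triu(np.sign(-np.subtract.outer(x, x)), 1))
    assert s_loop == s_vec
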
8 changes: 4 additions & 4 deletions rdtools/filtering.py
@@ -424,8 +424,9 @@ def logic_clip_filter(power_ac,
# series sampling frequency is less than 95% consistent.
_check_data_sampling_frequency(power_ac)
# Get the sampling frequency of the time series
time_series_sampling_frequency = power_ac.index.to_series().diff()\
.astype('timedelta64[m]').mode()[0]
time_series_sampling_frequency = (
power_ac.index.to_series().diff() / pd.Timedelta('60s')
).mode()[0]
# Make copies of the original inputs for the cases where the data is
# changed for clipping evaluation
original_time_series_sampling_frequency = time_series_sampling_frequency
@@ -651,8 +652,7 @@ def xgboost_clip_filter(power_ac,
# series sampling frequency is less than 95% consistent.
_check_data_sampling_frequency(power_ac)
# Get the most common sampling frequency
sampling_frequency = int(power_ac.index.to_series().diff()
.astype('timedelta64[m]').mode()[0])
sampling_frequency = int((power_ac.index.to_series().diff() / pd.Timedelta('60s')).mode()[0])
freq_string = str(sampling_frequency) + "T"
# Min-max normalize
# Resample the series based on the most common sampling frequency
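
Note on the rdtools/filtering.py changes above: both clipping filters now obtain the modal sampling interval in minutes by dividing the index differences by pd.Timedelta('60s') rather than casting with .astype('timedelta64[m]'), whose behavior changed in recent pandas releases. A minimal sketch on a hypothetical 5-minute series (not rdtools code):

    import pandas as pd

    index = pd.date_range('2022-01-01', periods=12, freq='5min')
    power_ac = pd.Series(range(12), index=index)

    sampling_minutes = (
        power_ac.index.to_series().diff() / pd.Timedelta('60s')
    ).mode()[0]
    print(sampling_minutes)  # 5.0
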
6 changes: 3 additions & 3 deletions rdtools/test/analysis_chains_test.py
@@ -495,11 +495,11 @@ def test_srr_soiling(soiling_analysis_sensor):
ci = srr_results['sratio_confidence_interval']
renorm_factor = srr_results['calc_info']['renormalizing_factor']
print(f'soiling ci:{ci}')
assert 0.965 == pytest.approx(sratio, abs=1e-3),\
assert 0.965 == pytest.approx(sratio, abs=1e-3), \
'Soiling ratio different from expected value in TrendAnalysis.srr_soiling'
assert [0.96, 0.97] == pytest.approx(ci, abs=1e-2),\
assert [0.96, 0.97] == pytest.approx(ci, abs=1e-2), \
'Soiling confidence interval different from expected value in TrendAnalysis.srr_soiling'
assert 0.974 == pytest.approx(renorm_factor, abs=1e-3),\
assert 0.974 == pytest.approx(renorm_factor, abs=1e-3), \
'Renormalization factor different from expected value in TrendAnalysis.srr_soiling'


2 changes: 1 addition & 1 deletion rdtools/test/availability_test.py
@@ -162,7 +162,7 @@ def difficult_data():

# generate a plausible clear-sky power signal
times = pd.date_range('2019-01-01', '2019-01-06', freq='15min',
tz='US/Eastern', closed='left')
tz='US/Eastern')
location = pvlib.location.Location(40, -80)
clearsky = location.get_clearsky(times, model='haurwitz')
# just scale GHI to power for simplicity
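
Note on the change above: pd.date_range's closed argument was deprecated in pandas 1.4 and removed in pandas 2.0, so the test drops it and the generated index now includes both endpoints. If left-closed behavior were still wanted, the replacement keyword is inclusive, as in this small sketch (not part of the test itself):

    import pandas as pd

    times = pd.date_range('2019-01-01', '2019-01-06', freq='15min',
                          tz='US/Eastern', inclusive='left')
    print(times[-1])  # 2019-01-05 23:45:00-05:00; the right endpoint is excluded
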
20 changes: 20 additions & 0 deletions rdtools/test/clearsky_temperature_test.py
@@ -1,6 +1,8 @@
import pytest
import datetime
import pandas as pd


from rdtools.clearsky_temperature import get_clearsky_tamb


@@ -35,3 +37,21 @@ def test_not_on_land():
with pytest.warns(UserWarning, match='possibly invalid Lat/Lon coordinates'):
ocean_cs_tamb = get_clearsky_tamb(dt, 40, -60)
assert ocean_cs_tamb.isnull().all()


def test_with_tricky_timezones():
# Some timezones have DST shifts at midnight, which
# can lead to NonExistentTimeError. This tests for the
# problem in issue #372

tz = 'America/Santiago'
start_date = datetime.datetime(2018, 8, 10, 0, 0, 0)
end_date = datetime.datetime(2018, 8, 14, 23, 0, 0)
freq = 'H'
lat = -24
lon = -70

times = pd.date_range(start=start_date, end=end_date, freq=freq)
times = times.tz_localize(tz=tz, ambiguous='infer',
nonexistent='shift_forward')
get_clearsky_tamb(times, lat, lon)
2 changes: 1 addition & 1 deletion rdtools/test/degradation_test.py
@@ -34,7 +34,7 @@ def get_corr_energy(cls, rd, input_freq):
freq = input_freq

x = pd.date_range(start=start, end=end, freq=freq)
day_deltas = (x - x[0]).astype('timedelta64[s]') / (60.0 * 60.0 * 24)
day_deltas = (x - x[0]) / pd.Timedelta('1d')
noise = (np.random.rand(len(day_deltas)) - 0.5) / 1e3

y = 1 + daily_rd * day_deltas + noise