I am trying to apply Time-of-Use metering rates to calculate the dollar value of the energy produced by a solar panel array (San Diego Gas & Electric's EV-TOU-5 rate, in particular). The value changes according to the time of day, day of week, and holidays during the year. This will be part of a larger model that factors in savings from EV charging at off-peak rates after midnight, etc. So just displaying kWh/month misses the full picture I'm trying to model.
Rather than jump into coding this from scratch with PVLIB, I would appreciate any pointers to existing software.
I have a question regarding time conventions in pvlib.
As far as I understand, all the computations use an instantaneous timestep convention: instantaneous weather produces instantaneous electric power.
However, in weather models like GFS, the GHI parameter is accumulated over the preceding hour. This makes the solar radiation inconsistent with the astronomical parameters (zenith, azimuth, ...).
For example, if I look at the erbs function, used to compute DNI and DHI from GHI:
df_dni_dhi_kt = pvlib.irradiance.erbs(ghi, zenith, ghi.index)
Here, all parameters are hourly time series, but because of the convention mismatch, the output may be inaccurate.
ghi : cumulated radiation over last hour
zenith : zenith angle at exact hour (instantaneous)
ghi.index : hourly DateTimeIndex
At the end of the power conversion process, a shift is observed between observations and the model (please ignore the amplitude difference; only the time shift matters).
Any ideas on how to use cumulated GHI as input to the library?
When using hourly data there definitely is a dilemma in how to calculate the solar position. The most common method is to calculate it for the middle of the time step. This is an improvement over using either the start or end of the hour (as shown in your example). However, around sunrise and sunset this poses an issue, as the sun may be below the horizon at the middle of the hour. Some therefore calculate the sun position for the middle of the period defined as the part of the hour where the sun is above the horizon, but that adds complexity.
There's a good discussion on the topic here: https://pvlib-python.readthedocs.io/en/stable/gallery/irradiance-transposition/plot_interval_transposition_error.html
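A minimal sketch of the midpoint convention, assuming the hourly GHI is right-labelled (each label marks the end of its accumulation interval, as in GFS). Only pandas is used here; the shifted index is what you would then hand to a solar-position routine such as pvlib.solarposition.get_solarposition:

```python
import pandas as pd

# Hourly interval labels; assume each label marks the END of the
# accumulation interval (GFS-style cumulated GHI).
labels = pd.date_range("2023-06-01 01:00", periods=5, freq="h", tz="UTC")

# Evaluate solar position at the interval midpoints, i.e. 30 minutes
# before each right-labelled timestamp.
midpoints = labels - pd.Timedelta(minutes=30)

# `midpoints` would be passed to the solar-position calculation, e.g.
# pvlib.solarposition.get_solarposition(midpoints, lat, lon), while the
# irradiance values keep their original hourly labels.
print(midpoints[0])  # 2023-06-01 00:30:00+00:00
```

This keeps the irradiance series untouched and only changes where the zenith/azimuth are sampled, which removes most of the systematic time shift shown in the linked gallery example.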
I am working on an NFT staking smart contract,
but the daily ROI should change as time goes by.
For example, the normal ROI is 3% daily, but after 3 days it goes down to 1.5%, and after 9 days it becomes 0.
The ROI can be recovered through an additional purchase of the ERC20 token or by using claimable rewards.
To sum up, the main problem is calculating the daily ROI, while stakers should be able to claim rewards at any time.
How should I approach this in Solidity?
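Before porting to Solidity, the accrual schedule itself can be prototyped in a few lines. This Python sketch assumes simple (non-compounding) interest and uses the exact breakpoints from the question:

```python
# Piecewise daily ROI from the question: 3%/day for the first 3 days,
# 1.5%/day for days 3-9, then 0 afterwards. Returns the total accrued
# reward (in percent of the stake) as a function of days elapsed since
# the last stake/recovery event.
def accrued_roi_percent(days: float) -> float:
    first = min(days, 3.0) * 3.0                    # 3%/day up to day 3
    second = max(min(days, 9.0) - 3.0, 0.0) * 1.5   # 1.5%/day, days 3..9
    return first + second                           # flat after day 9

print(accrued_roi_percent(2))   # 6.0
print(accrued_roi_percent(9))   # 18.0
print(accrued_roi_percent(30))  # 18.0 -- accrual has stopped
```

In Solidity you would store a per-staker `lastUpdate` timestamp, evaluate this piecewise sum inside the claim function using `block.timestamp`, and reset the clock (restoring the 3% tier) on an additional purchase or claim, so no per-day loop or keeper job is needed.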
We are developing a Trucking Management Software web app.
For IFTA calculation and reporting purposes, I need to calculate a route with 60-120 waypoints:
no directions, just total mileage and mileage by state.
I tried researching this, but I'm not sure I can find it.
Can Google handle that many waypoints without directions, and can I get a mileage breakdown by state?
Please help
The Distance Matrix API fits your need best. Each request can take up to 25 destinations so you'd have to split up the route into batches of 25 or fewer. Note that the pricing is different if you want simple distances and traffic-independent travel times vs. if you want travel times that take into account traffic information. The API does not split up a leg that crosses state lines, so you'd need to insert waypoints for state borders if you want to divide your mileage tallies by state.
You could also use the Directions API (handles up to 25 waypoints between the origin and destination) but its pricing charges the higher price for any request that includes more than 10 waypoints, so it won't be as cost-effective as the Distance Matrix API.
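The batching bookkeeping is the only non-API part; here is a Python sketch. The actual fetch-and-sum of each batch is left out because it depends on which client library you use, and the leg/batch structure below is my assumption about how to map an ordered route onto Distance Matrix requests:

```python
# Split a long ordered waypoint list into consecutive (origin, destination)
# legs, then batch the legs so no single Distance Matrix request exceeds
# 25 elements. Waypoint names are placeholders.
def make_legs(waypoints):
    """Pair each waypoint with its successor along the route."""
    return list(zip(waypoints[:-1], waypoints[1:]))

def batch(items, size=25):
    """Chunk a list into sublists of at most `size` items."""
    return [items[i:i + size] for i in range(0, len(items), size)]

waypoints = [f"wp{i}" for i in range(120)]   # e.g. 120 stops on the route
legs = make_legs(waypoints)                  # 119 consecutive legs
batches = batch(legs)                        # requests of <= 25 legs each

print(len(legs), len(batches), len(batches[-1]))  # 119 5 19
```

For the per-state breakdown, insert an extra waypoint at each border crossing as described above; then every leg lies entirely within one state and its distance can be tallied into that state's total.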
I have 3 values whose sizes I don't need to compare against each other, just their trends. However, when I try to put them into a line chart/combo chart, one value is always drowned out.
The values are: Summation of Spending, Percentage of Turns, and Time To Sell (in days). So one month can have this:
Spend: 200,000
Turns: 9%
TTS: 107
This data is by month as well.
The one thing I am trying from the article below is stacking charts on top of each other, which is ugly and causes me to lose some visibility on the chart in the back.
The article gets me close, but not quite.
Stack Overflow Question
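One common workaround is to index each series to 100 at the first month, so the chart compares trends rather than magnitudes (an alternative is one secondary axis per series). A pandas-only sketch, using the example figures from the question; the second and third months are made up for illustration:

```python
import pandas as pd

# Three monthly series on wildly different scales (values illustrative).
df = pd.DataFrame({
    "Spend": [200_000, 220_000, 180_000],
    "Turns": [0.09, 0.10, 0.08],
    "TTS":   [107, 95, 120],
}, index=pd.period_range("2023-01", periods=3, freq="M"))

# Index every series to 100 at the first month, so only the *trend*
# is visible, not the magnitude.
indexed = df / df.iloc[0] * 100

print(indexed.round(1))
```

Plotting `indexed` as an ordinary line chart puts all three series on a comparable 80-120 scale, so none of them is drowned out; the y-axis then reads as "percent of the first month".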
In the field of stock market technical analysis there is the concept of rectangular price congestion levels, that is: the price goes up and down essentially never breaking the previous high and low price levels for some time, forming the figure of a rectangle. E.g.: http://cf.ydcdn.net/1.0.0.25/images/invest/congestion%20area.jpg.
Edit: to be clearer: the stock market, as well as the forex market, moves in sets of movements called "impulses" and "corrections", the first in the direction of the current trend and the other in the opposite direction. When the stock is moving in the direction of the trend, the impulse movement is always bigger than the following correction, but sometimes the correction ends up being the same size as the impulse. So for example, in a stock with a positive trend, an impulse moves the price from $10.00 to $15.00, and then a correction drops it to $12.00. When the new impulse appears, though, instead of passing the previous high ($15.00), it stops exactly on it, and is followed by a new correction that drops the price exactly to the previous low ($12.00). So now we may draw two parallel horizontal lines on the stock's chart: one at the $15.00 price and the other at $12.00, forming a channel in which the price is "congested". And if we draw two vertical bars at the extreme sides, we have a rectangle, with its top side at the high level and its bottom side at the low one.
I'm trying to create an algorithm in C++/Qt capable of detecting such patterns in candlestick data held in a list container (Qt's QList), but first I'm doing research to see if anybody knows of existing code that does this, so I can save a lot of effort and time developing the algorithm.
So my first question is: does anybody know of open-source code that can detect such a figure? Obviously it doesn't have to match these exact conditions; if there is code that does a similar task and only needs adjustments from me, that would be fine.
On the other hand, how could I create such an algorithm anyway? Clearly the key is to detect the high and low levels and then monitor when those levels are broken to detect the end of the figure, but how could I do that efficiently? Today the best I can do is detect high and low levels using time as a parameter (e.g. "the highest price in four candles"), and even that uses very expensive code.
Technical analysis is very vague and subjective, and hard to code in a program when everyone sees different things in the same chart. A good start would be to use a cost function, such as choosing levels that minimize the sum of squared distances, which penalizes large deviations more than smaller ones.
You could use the idea of 'hysteresis' thresholding: create a state machine with four transitions describing how the price touches the low (L) or high (H) level: L->L (first time a new low level is reached), H->L (return to the low level), H->H (a new high level), and L->H (return to the high level).
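A sketch of that idea in Python (rather than C++/Qt, but the logic ports directly): treat the congestion channel as a [low, high] band, count alternating touches of its edges, and end the rectangle at the first close outside the band. The touch-counting and the `min_touches` threshold are my assumptions about what "confirmed" should mean, not a standard definition:

```python
# Hysteresis-style rectangle detection on a list of closing prices.
def find_rectangle(closes, low, high, min_touches=2):
    """Scan closes against the [low, high] channel.

    Returns (breakout_index, confirmed): the index of the first close
    outside the band (None if there is no breakout), and whether each
    edge was touched at least `min_touches` times in alternation.
    """
    touches = {"L": 0, "H": 0}
    last = None
    for i, c in enumerate(closes):
        if c > high or c < low:           # channel broken: figure ends
            confirmed = all(t >= min_touches for t in touches.values())
            return i, confirmed
        level = "H" if c == high else ("L" if c == low else None)
        if level and level != last:       # hysteresis: ignore repeated
            touches[level] += 1           # touches of the same edge
            last = level
    return None, False

closes = [12, 13, 15, 14, 12, 13, 15, 16]  # oscillates, then breaks 15
print(find_rectangle(closes, low=12, high=15))  # (7, True)
```

This runs in a single O(n) pass per candidate channel; picking the `low`/`high` values themselves is where the least-squares cost function mentioned above comes in.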