Pyomo load function - pyomo

I have got a file named test.xls which looks like this:
All
Index MaxProd
Nuclear 300,0
DomesticCoal_Anthracite 588,0
BrownLignite 203,1
ImportedCoal_SubBituminous 150,4
ImportedCoal_Bituminous 194,4
CCGT_1 500,0
CCGT_2 500,0
CCGT_3 500,0
CCGT_4 667,5
OCGT_1 400,0
OCGT_2 400,0
OCGT_3 400,0
FuelOilGas 441,8
Here All (and everything below it) is in column A and MaxProd is in column B. But when I try to read the data in with my little script:
from pyomo.environ import *
model = AbstractModel()
model.Index = Set()
model.MaxProd = Param(model.Index)
data = DataPortal(model=model)
data.load(filename="Test.xls", range="All", format='set', set=model.Index)
It gives me the error message:
OSError: Unknown range name All'
Does anyone know what I am doing wrong?
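For context, here is a minimal sketch of how DataPortal.load is typically called to pull a set and an indexed parameter from a spreadsheet; the range argument refers to a named range defined inside the workbook (not a cell value), and the range name "MaxProdTable" below is only a hypothetical example of such a name:
from pyomo.environ import *

model = AbstractModel()
model.Index = Set()
model.MaxProd = Param(model.Index)

data = DataPortal(model=model)
# "MaxProdTable" is a hypothetical named range covering the Index and
# MaxProd columns; it has to be defined in the workbook first.
data.load(filename="Test.xls", range="MaxProdTable",
          param=model.MaxProd, index=model.Index)

instance = model.create_instance(data)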

Django - PuLP optimisation AttributeError: 'str' object has no attribute 'hash'

I am writing a script for PuLP optimisation that works with my Django database. The problem contains a few thousand variables to be optimised and several hundred constraints, which vary depending upon the values of a, b, c.
var_names = []
var_values = []
for _foo_ in list_1:
    for _bar_ in list_2:
        for _var_ in list_3:
            for _eet_ in list_4:
                var_name = str(_foo_) + str(_bar_) + str(_var_) + str(_eet_)
                var_names.append(var_name)
                exec(var_name + " = LpVariable('" + var_name + "', lowBound=0, cat='Integer')")
                var_value = DataBase.objects.get(column_A=str(_foo_) + str(_var_)).value
                var_values.append(var_value)

obj_func = LpAffineExpression([(var_names[i], var_values[i]) for i in range(len(var_names))])

problem = LpProblem(name="name", sense=LpMinimize)

# Example of constraints
exec("problem += " + str(function(a1, b1, c1)) + " + " + str(function(a1, b1, c2)) + " >= Database_2.objects.get(column_A=z1).value")

problem += obj_func
problem.solve()
The code works in a Jupyter notebook when I load the database info as a dataframe. However, I keep receiving the following error when running it in Django:
File "/path/to/files/prob.py", line 1610, in <module>
problem.solve()
File "/path/to/files/lib/python3.9/site-packages/pulp/pulp.py", line 1913, in solve
status = solver.actualSolve(self, **kwargs)
File "/path/to/files/lib/python3.9/site-packages/pulp/apis/coin_api.py", line 137, in actualSolve
return self.solve_CBC(lp, **kwargs)
File "/path/to/files/lib/python3.9/site-packages/pulp/apis/coin_api.py", line 153, in solve_CBC
vs, variablesNames, constraintsNames, objectiveName = lp.writeMPS(
File "/path/to/files/lib/python3.9/site-packages/pulp/pulp.py", line 1782, in writeMPS
return mpslp.writeMPS(self, filename, mpsSense=mpsSense, rename=rename, mip=mip)
File "/path/to/files/lib/python3.9/site-packages/pulp/mps_lp.py", line 204, in writeMPS
constrNames, varNames, cobj.name = LpProblem.normalisedNames()
File "/path/to/files/lib/python3.9/site-packages/pulp/pulp.py", line 1546, in normalisedNames
_variables = self.variables()
File "/path/to/files/lib/python3.9/site-packages/pulp/pulp.py", line 1624, in variables
self.addVariables(list(self.objective.keys()))
File "/path/to/files/lib/python3.9/site-packages/pulp/pulp.py", line 1614, in addVariables
self.addVariable(v)
File "/path/to/files/lib/python3.9/site-packages/pulp/pulp.py", line 1603, in addVariable
if variable.hash not in self._variable_ids:
AttributeError: 'str' object has no attribute 'hash'
I have the code stored in a .py file called views.py.
I believe it may be an issue with the namespace of the LpVariable creations. I have tried to:
Define the whole problem encapsulated in a function that takes no arguments and returns a dict of the solution.
Define the problem as a class with problem.create and problem.solve as methods to create the variables and solve the problem.
Update the exec() code to store the variables in the globals dictionary:
exec(str(_foo_)+str(_bar_)+str(_var_)+str(_eet_) + " = LpVariable('" + str(_foo_)+str(_bar_)+str(_var_)+str(_eet_) + "', lowBound=0, cat='Integer')", globals())
And alternatively creating a local dict and executing the above code with locals(), local_dict.
Used LpVariable.dicts:
variables = LpVariable.dicts("variables", [(_foo_, _bar_, _var_, _eet_) for _foo_ in list_1 for _bar_ in list_2 for _var_ in list_3 for _eet_ in list_4], lowBound=0, cat="Integer")
This does create all the variables, however the function used in the constraints references the variables by the name str(_foo_)+str(_bar_)+str(_var_)+str(_eet_) and not by the dict key, which then generates undefined variable errors.
As mentioned, this code does work in Jupyter; I am just at a loss as to what the error may be a result of.
LpAffineExpression expects LpVariable objects (paired with their coefficients), not the name strings in var_names[i].
var_items = []
# exec(str(_foo_)+str(_bar_)+str(_var_)+str(_eet_) + "= LpVariable(" + str(_foo_)+str(_bar_)+str(_var_)+str(_eet_)+", lowBound=0, cat='Integer')")
# var_value = DataBase.objects.get(column_A = str(_foo_)+str(_var_)).value
# var_values.append(var_value)
var_value = LpVariable(var_name, lowBound=0, cat='Integer')
var_coeff = DataBase.objects.get(column_A = str(_foo_)+str(_var_)).value
var_items.append((var_value, var_coeff))
# obj_func = LpAffineExpression([var_names[i],var_values[i] for i in range(len(var_names))])
obj_func = LpAffineExpression(var_items)
Reference: https://coin-or.github.io/pulp/technical/pulp.html#pulp.LpAffineExpression
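As a self-contained sketch of this pattern (the index lists, coefficients, and constraint below are made-up placeholders, not the question's actual Django data), the objective can be built from (LpVariable, coefficient) pairs and the same dict-keyed variables can be referenced directly in constraints, so no exec is needed:
from pulp import LpProblem, LpVariable, LpAffineExpression, LpMinimize

# Hypothetical index sets standing in for list_1 .. list_4
list_1, list_2, list_3, list_4 = ["a1"], ["b1"], ["x", "y"], ["e1"]
keys = [(f, b, v, e) for f in list_1 for b in list_2 for v in list_3 for e in list_4]

# One LpVariable per key, created once and kept in a dict
variables = LpVariable.dicts("var", keys, lowBound=0, cat="Integer")

# Hypothetical coefficients (in the question these come from the database)
coeffs = {k: 1.0 for k in keys}

problem = LpProblem(name="name", sense=LpMinimize)
problem += LpAffineExpression([(variables[k], coeffs[k]) for k in keys])  # objective

# Constraints reference the dict entries directly instead of exec'd names
problem += variables[("a1", "b1", "x", "e1")] + variables[("a1", "b1", "y", "e1")] >= 10

problem.solve()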

Python: when there is no input for the subscriber it starts showing an error - how to solve it?

I have provided the following Python code, but the problem is that when it does not receive any input, it starts showing an error. How can I modify the code so that this error does not appear?
#!/usr/bin/env python
from roslib import message
import rospy
import sensor_msgs.point_cloud2 as pc2
from sensor_msgs.msg import PointCloud2, PointField
import numpy as np
import ros_numpy
from geometry_msgs.msg import Pose

# listener
def listen():
    rospy.init_node('listen', anonymous=True)
    rospy.Subscriber("/Filtered_points_x", PointCloud2, callback_kinect)

def callback_kinect(data):
    pub = rospy.Publisher('lidar_distance', Pose, queue_size=10)
    data_lidar = Pose()
    xyz_array = ros_numpy.point_cloud2.pointcloud2_to_xyz_array(data)
    print(xyz_array)
    mini_data = min(xyz_array[:, 0])
    print("mini_data", mini_data)
    data_lidar.position.x = mini_data
    pub.publish(data_lidar)
    print("data_points", data_lidar.position.x)
    height = int(data.height / 2)
    middle_x = int(data.width / 2)
    middle = read_depth(middle_x, height, data)  # do stuff with middle

def read_depth(width, height, data):
    if (height >= data.height) or (width >= data.width):
        return -1
    data_out = pc2.read_points(data, field_names=('x', 'y', 'z'), skip_nans=True, uvs=[[width, height]])
    int_data = next(data_out)
    rospy.loginfo("int_data " + str(int_data))
    return int_data

if __name__ == '__main__':
    try:
        listen()
        rospy.spin()
    except rospy.ROSInterruptException:
        pass
The following is the error that I mentioned:
[[ 7.99410915 1.36072445 -0.99567264]]
('mini_data', 7.994109153747559)
('data_points', 7.994109153747559)
[INFO] [1662109961.035894]: int_data (7.994109153747559, 1.3607244491577148, -0.9956726431846619)
[]
[ERROR] [1662109961.135572]: bad callback: <function callback_kinect at 0x7f9346d44230>
Traceback (most recent call last):
File "/opt/ros/kinetic/lib/python2.7/dist-packages/rospy/topics.py", line 750, in _invoke_callback
cb(msg)
File "/home/masoumeh/catkin_ws/src/yocs_velocity_smoother/test4/distance_from_pointcloud.py", line 27, in callback_kinect
mini_data = min(xyz_array[:,0])
ValueError: min() arg is an empty sequence
The code is still receiving input, but specifically it's receiving an empty array. You're then trying to slice the empty array, causing the error. Instead you should check that the array has elements before the min(xyz_array[:,0]) line. It can be as simple as:
if xyz_array.size == 0:
    return
As another note, you're creating a publisher inside the callback. You shouldn't do this, as a new publisher will be created every time the callback fires. Instead, create it once as a global variable, as in the sketch below.
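A minimal sketch combining both suggestions, assuming the same topic names and message types as in the question:
#!/usr/bin/env python
import rospy
import ros_numpy
from sensor_msgs.msg import PointCloud2
from geometry_msgs.msg import Pose

pub = None  # created once in listen(), reused by every callback

def callback_kinect(data):
    xyz_array = ros_numpy.point_cloud2.pointcloud2_to_xyz_array(data)
    if xyz_array.size == 0:
        # Empty cloud this cycle: skip instead of calling min() on an empty array
        return
    data_lidar = Pose()
    data_lidar.position.x = min(xyz_array[:, 0])
    pub.publish(data_lidar)

def listen():
    global pub
    rospy.init_node('listen', anonymous=True)
    pub = rospy.Publisher('lidar_distance', Pose, queue_size=10)
    rospy.Subscriber("/Filtered_points_x", PointCloud2, callback_kinect)

if __name__ == '__main__':
    try:
        listen()
        rospy.spin()
    except rospy.ROSInterruptException:
        pass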

Could not fetch value from key inside list of dictionaries converted from BeautifulSoup ResultSet

I used BeautifulSoup to fetch a string representation of a list from https://api.huobi.pro/v1/common/symbols
Part of the string in the url looks like this:
{"status":"ok","data":[{"base-currency":"ont","quote-currency":"btc","price-precision":8,"amount-precision":4,"symbol-partition":"innovation","symbol":"ontbtc","state":"online","value-precision":8,"min-order-amt":0.01,"max-order-amt":100000,"min-order-value":0.0001,"leverage-ratio":2}, .. ]
This is how I scraped it into a ResultSet:
import requests
from bs4 import BeautifulSoup
import re
url = 'https://api.huobi.pro/v1/common/symbols'
res = requests.get(url)
html_page = res.content
soup = BeautifulSoup(html_page, 'html.parser',from_encoding='utf-8')
text = soup.find_all(text=True)
I then tried to convert the ResultSet into a list
huobiList = list(text)
Afterwards, I tried to print values from the symbol key in the list:
print([d['symbol'] for d in huobiList])
but got this error:
print([d['symbol'] for d in huobiList])
TypeError: string indices must be integers
When I tried printing the key using indices, it reads the key as letters instead of reading the entire key name:
>>>print([d[0] for d in huobiList])
[u'{']
Actually there are multiple problems here:
The symbol keys are inside the objects you are trying to print, but nested below the data key.
And before that, you have to convert your string to a dict using json.loads().
Check this code:
import requests
from bs4 import BeautifulSoup
import re
import json
url = 'https://api.huobi.pro/v1/common/symbols'
res = requests.get(url)
html_page = res.content
soup = BeautifulSoup(html_page, 'html.parser',from_encoding='utf-8')
text = soup.find_all(text=True)
huobiList = list(text)
datas = [json.loads(d).get('data') for d in huobiList]
for data in datas:
    print([d.get('symbol') for d in data])
RESULTS
['bfteth', 'kncbtc', 'nexoeth', 'mexbtc', 'bsvhusd', 'paybtc', 'icxeth', 'aebtc', 'bixeth', 'forusdt', 'itcusdt', 'abtbtc', 'btgbtc', 'eosht', 'hitbtc', 'dogeusdt', 'yeebtc', 'buteth', 'thetausdt', 'etneth', 'akrousdt', 'acteth', 'egtusdt', 'pcbtc', 'astbtc', 'oneht', 'renusdt', 'shebtc', 'naseth', 'aidoceth', 'bhdht', 'steemeth', 'sspeth', 'neobtc', 'ostbtc', 'qtumusdt', 'dgdbtc', 'idtbtc', 'btteth', 'venbtc', 'eoseth', 'dtausdt', 'ncashbtc', 'xtzbtc', 'ltcht', 'rteeth', 'utketh', 'iostusdt', 'zlabtc', 'nodeusdt', 'gtcbtc', 'bsvbtc', 'cnneth', 'scbtc', 'venusdt', 'edubtc', 'neousdt', 'ruffbtc', 'xmreth', 'crobtc', 'btseth', 'iostbtc', 'mcobtc', 'yccbtc', 'bchbtc', 'bsvusdt', 'itcbtc', 'cvcoinbtc', 'xmrusdt', 'xtzusdt', 'qtumbtc', 'cnnsbtc', 'thetabtc', 'dogebtc', 'gasbtc', 'bcveth', 'ruffusdt', 'skmbtc', 'topceth', 'reneth', 'uceth', 'bixusdt', 'gnxbtc', 'zjlteth', 'btchusd', 'gtusdt', 'wxtusdt', 'btcusdt', 'ctxceth', 'seelebtc', 'sceth', 'edueth', 'tnteth', 'hptbtc', 'ogobtc', 'rdnbtc', 'egccbtc', 'ontusdt', 'datbtc', 'daceth', 'nulseth', 'cvcoineth', 'dtaeth', 'lymbtc', 'grsbtc', 'idteth', 'xvgeth', 'dgdeth', 'atombtc', 'getbtc', 'propybtc', 'irisbtc', 'adaeth', 'btmeth', 'hitusdt', 'engeth', 'meeteth', 'wprbtc', 'ontbtc', 'cdcbtc', 'ogousdt', 'mdsusdt', 'wavesusdt', 'xrpbtc', 'mtbtc', 'bcdbtc', 'blzbtc', 'maneth', 'quneth', 'nexobtc', 'gtbtc', 'ftibtc', 'ektusdt', 'zrxbtc', 'bkbtbtc', 'ekobtc', 'wavesbtc', 'gxcbtc', 'covabtc', 'toseth', 'dbcbtc', 'elfeth', 'icxbtc', 'sspbtc', 'swftcbtc', 'bifibtc', 'ethbtc', 'storjbtc', 'bftbtc', 'oneusdt', 'aeeth', 'cvceth', 'wtceth', 'eoshusd', 'payeth', 'pvtht', 'eosusdt', 'lxtusdt', 'bt1btc', 'newbtc', 'creht', 'xmxbtc', 'ckbusdt', 'docketh', 'hptusdt', 'mtlbtc', 'iostht', 'mdsbtc', 'zileth', 'npxseth', 'wiccbtc', 'hiteth', 'btmusdt', 'akroht', 'iosteth', 'gveeth', 'onebtc', 'emht', 'cvntbtc', 'ckbbtc', 'newusdt', 'gntbtc', 'eosbtc', 'chatbtc', 'etcbtc', 'engbtc', 'stkbtc', 'lskbtc', 'rtebtc', 'loombtc', 'sbtcbtc', 'qtumeth', 'trioeth', 'bixbtc', 'mtneth', 'xvgbtc', 'storjusdt', 'vidyht', 'srnbtc', 'skmusdt', 'veneth', 'mexeth', 'gaseth', 'vsysht', 'xzceth', 'cnnbtc', 'rsrht', 'xtzeth', 'pnteth', 'crousdt', 'itceth', 'bateth', 'zenbtc', 'fsnbtc', 'etchusd', 'mtxbtc', 'ncceth', 'qashbtc', 'rbtcbtc', 'fttbtc', 'ctxcbtc', 'kanusdt', 'egtbtc', 'nulsbtc', 'elfusdt', 'topcbtc', 'ltcusdt', 'wiccusdt', 'covaeth', 'smtbtc', 'elaeth', 'gtht', 'thetaeth', 'btttrx', 'ltchusd', 'ekoeth', 'dtabtc', 'letbtc', 'wtcusdt', 'nodeht', 'forbtc', 'hceth', 'gnxeth', 'kcashbtc', 'wpreth', 'gsceth', 'uipbtc', 'atpusdt', '18cbtc', 'aacbtc', 'bkbteth', 'blzeth', 'nulsusdt', 'rcneth', 'gtceth', 'omgusdt', 'lxtbtc', 'paxhusd', 'wxtht', 'ugasbtc', 'bchht', 'polybtc', 'btmbtc', 'soceth', 'hoteth', 'bhdusdt', 'atometh', 'letusdt', 'egcceth', 'atpbtc', 'waxpeth', 'ctxcusdt', 'fttusdt', 'lambbtc', 'xembtc', 'bhdbtc', 'ltcbtc', 'bcvbtc', 'tntbtc', 'embtc', 'nknusdt', 'zilbtc', 'wicceth', 'vidyusdt', 'pvtusdt', 'ardrbtc', 'ttusdt', 'dgbbtc', 'xlmeth', 'dashbtc', 'vetusdt', 'boxeth', 'kmdbtc', 'htbtc', 'waxpbtc', 'wtcbtc', 'swftceth', 'batusdt', 'ckbht', 'lambusdt', 'uipusdt', 'elausdt', 'xzcbtc', 'xemusdt', 'npxsbtc', 'smtusdt', 'adxeth', 'rsrbtc', 'dcreth', 'vsysbtc', 'kanbtc', 'loometh', 'trxusdt', 'elfbtc', 'mtnbtc', 'dbceth', 'omgbtc', 'linkusdt', 'lsketh', 'mxcbtc', 'musketh', 'manausdt', 'lxteth', 'uuueth', 'mxbtc', 'rsrusdt', 'skmht', 'socbtc', 'qspeth', 'trxbtc', 'gnteth', 'paibtc', 'gvebtc', 'srneth', 'dockbtc', 'usdchusd', 'evxbtc', 'hptht', 'mxusdt', 'paiusdt', 'qasheth', 
'fsnusdt', 'gntusdt', 'topht', 'cvnteth', 'mtxeth', 'emusdt', 'xrpht', 'salteth', 'vetbtc', 'newht', 'rcccbtc', 'etcusdt', 'stketh', 'xzcusdt', 'ocneth', 'arpaht', 'leteth', 'smteth', 'vsysusdt', 'dockusdt', 'nccbtc', 'triobtc', 'nanobtc', 'zeneth', 'tnbeth', 'gscbtc', 'reqbtc', 'manabtc', 'lbaeth', 'portalbtc', 'lambeth', 'faireth', 'ugaseth', 'dashhusd', 'ethusdt', 'socusdt', 'croht', 'lolht', 'appcbtc', 'aaceth', 'xlmusdt', 'dashusdt', 'ethhusd', 'topusdt', 'wanbtc', 'batbtc', 'hcusdt', 'elabtc', 'fttht', 'bhtht', 'evxeth', 'zilusdt', 'htusdt', 'powrbtc', 'cnnsht', 'datxeth', 'hthusd', 'sncbtc', 'tusdhusd', 'phxbtc', 'linkbtc', 'cmteth', 'topbtc', 'uipeth', 'iicbtc', 'ektbtc', 'nknbtc', 'ttht', 'pvtbtc', 'algoeth', 'pceth', 'omgeth', 'luneth', 'adxbtc', 'iotabtc', 'zrxusdt', 'iotausdt', 'ocnbtc', 'sheeth', 'lbausdt', 'dgbeth', 'mxht', 'kaneth', 'aidocbtc', 'sntbtc', 'atomusdt', 'etnbtc', 'forht', 'xlmbtc', 'muskbtc', 'mtht', 'saltbtc', 'cmtusdt', 'gxcusdt', 'irisusdt', 'atpht', 'xrphusd', 'btsbtc', 'nknht', 'nanousdt', 'uuubtc', 'arpabtc', 'knceth', 'wxtbtc', 'xrpusdt', 'hotbtc', 'ardreth', 'polyeth', 'snceth', 'lambht', 'abteth', 'yeeeth', 'butbtc', 'hteth', 'kmdeth', 'arpausdt', 'bchhusd', 'cmtbtc', 'rcnbtc', 'asteth', 'bchusdt', 'ncasheth', 'nasusdt', 'qspbtc', 'dcrbtc', 'zecusdt', 'btsusdt', 'zechusd', 'bcxbtc', 'fsnht', 'paieth', '18ceth', 'uuuusdt', 'adabtc', 'bttusdt', 'dogeeth', 'ycceth', 'kcasheth', 'rccceth', 'egtht', 'steembtc', 'zlaeth', 'bt2btc', 'sntusdt', 'zecbtc', 'hcbtc', 'usdthusd', 'seeleusdt', 'ruffeth', 'nasbtc', 'bttbtc', 'seeleeth', 'reqeth', 'ocnusdt', 'lunbtc', 'veteth', 'kcashht', 'dateth', 'nanoeth', 'appceth', 'nodebtc', 'geteth', 'hb10usdt', 'zjltbtc', 'ucbtc', 'iotaeth', 'zrxeth', 'dcrusdt', 'ftieth', 'etcht', 'lymeth', 'xmrbtc', 'cvcusdt', 'crebtc', 'gxceth', 'iriseth', 'chateth', 'cnnsusdt', 'aeusdt', 'mcoeth', 'linketh', 'portaleth', 'waveseth', 'pntbtc', 'ttbtc', 'utkbtc', 'meetbtc', 'adausdt', 'fairbtc', 'actusdt', 'osteth', 'vidybtc', 'manaeth', 'cdceth', 'steemusdt', 'rdneth', 'qunbtc', 'mdseth', 'mteth', 'actbtc', 'bhtbtc', 'onteth', 'datxbtc', 'powreth', 'manbtc', 'trxeth', 'lolusdt', 'algobtc', 'lbabtc', 'propyeth', 'xmxeth', 'ogoht', 'waneth', 'tnbbtc', 'renbtc', 'bhtusdt', 'lolbtc', 'tosbtc', 'ekteth', 'dacbtc', 'creusdt', 'dashht', 'grseth', 'algousdt', 'boxbtc', 'cvcbtc', 'akrobtc', 'iiceth']
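As a side note, this endpoint returns plain JSON, so the BeautifulSoup step is not strictly needed; a minimal sketch of a more direct approach (same URL, same keys) would be:
import requests

url = 'https://api.huobi.pro/v1/common/symbols'
payload = requests.get(url).json()  # parse the JSON body directly into a dict

# Each entry under "data" is a dict; pull the "symbol" field from each
symbols = [entry.get('symbol') for entry in payload.get('data', [])]
print(symbols)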

JModelica optimization has runtime error

I am trying to follow different papers and tutorials to learn how to solve optimization problems for Modelica models.
In http://www.syscop.de/files/2015ss/events/opcon-thermal-systems/optimization_tool_chain_in_jmodelica.org_toivo_henningsson.pdf I found a very simple tutorial. But when I execute it I get some rather unspecific error messages.
I am using Python 2.7 with Jupyter.
Here is my notebook:
from pyjmi import transfer_optimization_problem
import matplotlib.pyplot as plt
import os.path
file_path = os.path.join("D:\Studies", "Integrator.mop")
op = transfer_optimization_problem('optI', file_path)
res = op.optimize()
t = res['time']
x = res['x']
u = res['u']
plt.plot(t,x,t,u)
My modelica file:
package Integrator
  model Integrator
    Real x(start = 2, fixed = true);
    input Real u;
  equation
    der(x) = -u;
  end Integrator;

  optimization optI(objective = finalTime, objectiveIntegrand = x^2 + u^2, startTime = 0, finalTime(free = true, min = 0.5, max = 2, initialGuess = 1))
    Real x(start = 2, fixed = true);
    input Real u;
  equation
    der(x) = -u;
  constraint
    u <= 2;
    x(finalTime) = 0;
  end optI;
end Integrator;
When I execute the code I get a RuntimeError telling me that a Java error occurred and that details were printed. From the traceback I do not know what the note
This file is compatible with both classic and new-style classes
means. I know that my setup is working because I executed the CSTR tutorial given by Modelon. But now I try to use my own models and it gives me that error.
Runtime error description
Using the same syntax as in Modelica imports, e.g.
import Modelica.SIunits.Temperature;
where the package structure is part of the model identification, should resolve the issue:
op = transfer_optimization_problem('Integrator.optI', file_path)
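A minimal sketch of the question's script with just that line changed (same file path as in the question):
from pyjmi import transfer_optimization_problem
import matplotlib.pyplot as plt
import os.path

file_path = os.path.join("D:\Studies", "Integrator.mop")
# Qualify the optimization class with its enclosing package
op = transfer_optimization_problem('Integrator.optI', file_path)
res = op.optimize()
plt.plot(res['time'], res['x'], res['time'], res['u'])
plt.show()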

Restoring a model with TensorFlow: 'NoneType' object is not iterable

I am restoring a model with TensorFlow. However, I am getting this error
return [dim.value for dim in self._dims]
TypeError: 'NoneType' object is not iterable
when I define the optimizer:
train = optimizer.minimize(lossBatch)
I tested the random generation of weights and it worked well.
def init_weights(shape):
    return tf.Variable(tf.random_uniform(shape, -0.01, 0.01, seed=0))
So I am concluding that the problem is related to the restoration of weights.
To restore the weights I am doing this:
with tf.Session() as sess:
    new_saver = tf.train.import_meta_graph('my-model-88500.meta')
    new_saver.restore(sess, 'my-model-88500')
    w_h1 = tf.get_default_graph().get_tensor_by_name("w_h1:0")
    b_h1 = tf.get_default_graph().get_tensor_by_name("b_h1:0")
    w_h2 = tf.get_default_graph().get_tensor_by_name("w_h2:0")
    b_h2 = tf.get_default_graph().get_tensor_by_name("b_h2:0")
    w_h3 = tf.get_default_graph().get_tensor_by_name("w_h3:0")
    b_h3 = tf.get_default_graph().get_tensor_by_name("b_h3:0")
    w_o = tf.get_default_graph().get_tensor_by_name("w_o:0")
    b_o = tf.get_default_graph().get_tensor_by_name("b_o:0")
    w_h1 = tf.reshape(w_h1, [numberInputs, numberHiddenUnits1], 'w_h1')
    b_h1 = tf.reshape(b_h1, [numberHiddenUnits1], 'b_h1')
    w_h2 = tf.reshape(w_h2, [numberHiddenUnits1, numberHiddenUnits2], 'w_h2')
    b_h2 = tf.reshape(b_h2, [numberHiddenUnits2], 'b_h2')
    w_h3 = tf.reshape(w_h3, [numberHiddenUnits2, numberHiddenUnits3], 'w_h3')
    b_h3 = tf.reshape(b_h3, [numberHiddenUnits3], 'b_h3')
    w_o = tf.reshape(w_o, [numberHiddenUnits3, numberOutputs], 'w_o')
    b_o = tf.reshape(b_o, [numberOutputs], 'b_o')
    init = tf.initialize_all_variables()
    sess.run(init)
Then I redefine the network:
numberEpochs=1500000
batchSize=25000
learningRate=0.000001
numberOutputs=np.shape(theTrainOutput)[1]
numberTrainSamples=np.shape(theTrainInput)[0]
numberInputs=np.shape(theTrainInput)[1]
xTrain=tf.placeholder("float",[numberTrainSamples,numberInputs])
yTrain=tf.placeholder("float",[numberTrainSamples,numberOutputs])
yTrainModel=model(xTrain,w_h1,b_h1,w_h2,b_h2,w_h3,b_h3,w_o,b_o)
xBatch=tf.placeholder("float",[batchSize,numberInputs])
yBatch=tf.placeholder("float",[batchSize,numberOutputs])
yBatchModel=model(xBatch,w_h1,b_h1,w_h2,b_h2,w_h3,b_h3,w_o,b_o)
lossBatch = tf.reduce_mean(tf.abs(yBatch-yBatchModel))
optimizer = tf.train.AdamOptimizer(learningRate)
train = optimizer.minimize(lossBatch)
I get the error in this last line. Note that beforehand I redefined the entire network to reuse the weights.
It is worth mentioning that I am able to get the shape of a weight, namely
w_h1.get_shape()
TensorShape([Dimension(13), Dimension(50)])
On the other hand,
w_h1.dtype
tf.float32
Furthermore, I am also able to print the weights:
print sess.run(w_h1)