I am building an agent in Amazon Lex with around 3 intents. All three intents have a slot that has been ticked as 'Required', meaning the agent has to prompt for that slot when the user query is missing it.
However, when I use a Lambda function as a code hook for validation, the function gets triggered without prompting for the missing slot.
For example, take an intent that returns call notes from a call with a specific person:
The prompt is "Specify the name of the person whose notes you want to see".
The objective of the Lambda function is to print out "Call notes for the person is XYZ".
When I don't use any Lambda function through code hook validation, I get a prompt for the name of the person,
but when I use code hook validation, the Lambda function gets triggered and I get the reply "Call notes for None is XYZ".
None, because there wasn't any mention of the person's name in the user query, nor am I being prompted for the person's name.
Can anybody help with this? I have tried all sorts of modifications in the Lambda function, but shouldn't the prompt be independent of the Lambda function?
I have been browsing and trying things for the last two or three days and have hit a dead end.
This is happening because Lambda initialization and validation happen before slot filling in Amazon Lex.
You can still check that the user has provided the "person" slot in the DialogCodeHook, i.e. the validation part.
Something like the code below should get the job done:
def build_validation_result(is_valid, violated_slot, message_content):
    if message_content is None:
        return {
            'isValid': is_valid,
            'violatedSlot': violated_slot
        }
    return {
        'isValid': is_valid,
        'violatedSlot': violated_slot,
        'message': {'contentType': 'PlainText', 'content': message_content}
    }

def validate_person(person):
    # apply validations here
    if person is None:
        return build_validation_result(False, 'person', 'Please enter the person name')
    return build_validation_result(True, None, None)

def get_notes(intent_request):
    source = intent_request['invocationSource']
    output_session_attributes = intent_request['sessionAttributes'] if intent_request['sessionAttributes'] is not None else {}
    slots = intent_request['currentIntent']['slots']
    person = slots['person']
    if source == 'DialogCodeHook':
        # check the existence of the "person" slot
        validation_result = validate_person(person)
        if not validation_result['isValid']:
            # clear the violated slot and ask the user for it again
            slots[validation_result['violatedSlot']] = None
            return elicit_slot(
                output_session_attributes,
                intent_request['currentIntent']['name'],
                slots,
                validation_result['violatedSlot'],
                validation_result['message']
            )
        return delegate(output_session_attributes, slots)
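For completeness, elicit_slot and delegate used above refer to the usual Lex (V1) dialog-action response helpers. If you don't already have them in your Lambda, a minimal version looks like this:

def elicit_slot(session_attributes, intent_name, slots, slot_to_elicit, message):
    # tells Lex to re-prompt the user for the given slot
    return {
        'sessionAttributes': session_attributes,
        'dialogAction': {
            'type': 'ElicitSlot',
            'intentName': intent_name,
            'slots': slots,
            'slotToElicit': slot_to_elicit,
            'message': message
        }
    }

def delegate(session_attributes, slots):
    # hands control back to Lex so it continues its normal flow
    return {
        'sessionAttributes': session_attributes,
        'dialogAction': {
            'type': 'Delegate',
            'slots': slots
        }
    }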
If the "person" slot is empty, user will be prompted the error message. This way you don't need to tick on "Required" on the slot.
This was the only workaround I could think of when I faced this problem while using DialogCodeHook.
Hope it helps.
I want to have a dynamic default value for a form. This will be whatever was inputted last time it was used.
I saw this answer that says to modify the default value after creation and then use the process() method. This works in updating it (I see the default displayed on the web page), but I get an error when the form submits saying "The CSRF token is missing". In Chrome devtools I see that the form has an input with id 'csrf_token' and a value, so I'm not sure what is happening.
from flask_wtf import FlaskForm
from wtforms import StringField, SubmitField
from wtforms.validators import DataRequired

class weight(FlaskForm):
    weight = StringField('Enter your weight', validators=[DataRequired()])
    submit = SubmitField('Submit')

last_weight = get_last_weight()  # database query function, returns a float (e.g. 85.4)
form = weight()
form.weight.default = last_weight
form.process()
A related question: is this the best way to set a dynamic default? I tried
StringField('Enter your weight', validators=[DataRequired()], default=get_last_weight())
as the docs say
default – The default value to assign to the field, if no form or object input is provided. May be a callable.
so I think the function is a callable (tell me if I'm wrong!), but I get an error saying the function is being used outside of the application context, so I think it is being called too early. It seems like this would be a common task, so I'm guessing I have missed something.
Thanks!
This is how I like to handle dynamic form building
def buildWeightForm(last_weight):
    class weight(FlaskForm):
        weight = StringField('Enter your weight', default=last_weight, validators=[DataRequired()])
        submit = SubmitField('Submit')
    return weight()
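Called from inside a view, the database query then runs within the application context (the route, template name and save step below are just placeholders):

from flask import Flask, render_template

app = Flask(__name__)

@app.route('/weight', methods=['GET', 'POST'])
def weight_view():
    form = buildWeightForm(get_last_weight())  # get_last_weight queries the DB
    if form.validate_on_submit():
        pass  # save form.weight.data here
    return render_template('weight.html', form=form)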
For your 2nd question, if you're getting an error doing this
StringField('Enter your weight', validators=[DataRequired()], default=get_last_weight())
Try the following (remove the parentheses from your function call):
StringField('Enter your weight', validators=[DataRequired()], default=get_last_weight)
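For what it's worth, the reason dropping the parentheses helps is that WTForms only invokes a callable default when the form is instantiated, i.e. inside a request where the application context exists, not at import time. A minimal sketch (class and helper names here are illustrative):

from flask_wtf import FlaskForm
from wtforms import StringField, SubmitField
from wtforms.validators import DataRequired

def get_last_weight():
    # stand-in for the real database query; called per form instance, not at import time
    return 85.4

class WeightForm(FlaskForm):
    # no parentheses: WTForms calls get_last_weight when the form is created
    weight = StringField('Enter your weight', validators=[DataRequired()], default=get_last_weight)
    submit = SubmitField('Submit')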
In views, I have a function defined that is executed when the user submits the form online. After the form submission there are some database transactions that I perform, and then, based on the existing data in the database, APIs are triggered:
def triggerapi():
    # execute the API that sends an email to the user and the administrator
    # about the submitted form
    ...

def databasetransactions():
    # check the data in the submitted form against the data in the DB
    # if the last data submitted by the user is 10 minutes old or more:
    #     triggerapi()
    ...

def formsubmitted(request):
    # save the user input in variables
    databasetransactions()
    # save the data from the submitted form in the DB
In the above case, the user clicks the submit button twice within less than 5 milliseconds. So two requests start processing in parallel and both trigger the email, which is not the desired behavior.
Is there a way to avoid this, so that for a user session the application only accepts new data once all of the older data processing is completed?
Since we are talking in pseudo-code, one way could be to use a singleton pattern for triggerapi() and return "Not Allowed" in case it is already instantiated.
There are multiple ways to solve this issue.
One of them would be to create a new session variable
request.session['activetransaction'] = True
This would, however, require you to pass request, unless it is already passed and we are only seeing a fragment of the code. You can also add an instance/class flag for it in the same way and check against it.
Another way, which might work if you need those submissions handled after the previous one, is to add a while request.session['activetransaction']: loop and do the handling afterwards.
def formsubmitted(request):
    if 'activetransaction' not in request.session or not request.session['activetransaction']:
        request.session['activetransaction'] = True
        # save the user input in variables
        databasetransactions()
        # save the data from the submitted form in the DB
        request.session['activetransaction'] = False
...
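If the session flag still leaves too big a race window (two requests can both pass the check before either one sets the flag), another option is an atomic lock through Django's cache framework. A rough sketch, assuming a configured cache backend such as Redis or Memcached; the key name and timeout are illustrative:

from django.core.cache import cache
from django.http import HttpResponse

def formsubmitted(request):
    lock_key = "form-lock-%s" % request.session.session_key
    # cache.add() only succeeds if the key is absent, so two concurrent
    # requests cannot both acquire the lock
    if not cache.add(lock_key, True, timeout=30):
        return HttpResponse("A submission is already being processed.", status=429)
    try:
        # save the user input in variables
        databasetransactions()
        # save the data from the submitted form in the DB
    finally:
        cache.delete(lock_key)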
Unfortunately I'm using django-channels 1.1.8, as I missed all the updates to Channels 2.0. Upgrading now is unrealistic, as we've just launched and it would take some time to figure out correctly.
Here's my problem:
I'm using message.user.id to differentiate between authenticated users that I need to send messages to. However, there are cases where I'll need to send messages to unauthenticated users as well, and that message depends on an external API call. I have done this in ws_connect():
import uuid

from channels import Group
from channels.auth import channel_session_user_from_http

@channel_session_user_from_http
def ws_connect(message):
    # create a group for the user
    if str(message.user) == "AnonymousUser":
        user_group = "AnonymousUser" + str(uuid.uuid4())
    else:
        user_group = str(message.user.id)
    print(f"user group is {user_group}")
    Group(user_group).add(message.reply_channel)
    Group(user_group).send({"accept": True})
    message.channel_session['get_user'] = user_group
This is only the first part of the issue, basically I'm appending a random
string to each AnonymousUser instance. But I can't find a way to access
this string from the request object in a view, in order to determine who
I am sending the message to.
Is this even achievable? Right now I'm not able to access anything set in
the ws_connect in my view.
EDIT: Following kagronick's advice, I tried this:
@channel_session_user_from_http
def ws_connect(message):
    # create group for user
    if str(message.user) == "AnonymousUser":
        user_group = "AnonymousUser" + str(uuid.uuid4())
    else:
        user_group = str(message.user.id)
    Group(user_group).add(message.reply_channel)
    Group(user_group).send({"accept": True})
    message.channel_session['get_user'] = user_group
    message.http_session['get_user'] = user_group
    print(message.http_session['get_user'])
    message.http_session.save()
However, http_session is None when user is AnonymousUser. Other decorators didn't help.
Yes, you can save to the session and access it in the view. But you need to use the http_session and not the channel session. Use the @http_session decorator or @channel_and_http_session. You may need to call message.http_session.save() (I don't remember, I'm on Channels 2 now). But after that you will be able to see the user's group in the view.
Also, using a group for this is kind of overkill. If the group will only ever have 1 user, put the reply_channel in the session and do something like Channel(request.session['reply_channel']).send() in the view. That way it doesn't need to look up the one user that is in the group and can send directly to the user.
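A rough sketch of that reply-channel variant on Channels 1.x (untested against 1.1.8, and it assumes the http session exists, which, as noted below, is not the case for AnonymousUser without a prior session cookie):

from channels import Channel
from channels.auth import channel_session_user_from_http
from channels.sessions import channel_and_http_session

@channel_session_user_from_http
@channel_and_http_session
def ws_connect(message):
    # remember this socket's reply channel so a view can push to it directly
    message.http_session['reply_channel'] = message.reply_channel.name
    message.http_session.save()
    message.reply_channel.send({"accept": True})

# later, in a regular Django view:
def notify(request):
    Channel(request.session['reply_channel']).send({"text": "hello"})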
If this solves your problem please mark it as accepted.
EDIT: unfortunately this only works locally but not in production. When the user is AnonymousUser, message.http_session doesn't exist.
User kagronick got me on the right track by pointing out that message has an http_session attribute. However, it seems http_session is always None in ws_connect when the user is AnonymousUser, which defeats our purpose.
I solved it by checking in the view whether the user is anonymous; if he is, which means he doesn't have a session (or at least Channels can't see it), I initialize one and assign the key get_user the value "AnonymousUser" + str(uuid.uuid4()) (this was previously done in the consumer).
After I did this, every time ws_connect is called, message.http_session carries get_user: either the user ID when someone is logged in, or "AnonymousUser" plus a uuid4.
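In code, that view-side fix might look roughly like this (the view name, template and the is_authenticated usage assume a Django version contemporary with Channels 1.x):

import uuid

from django.shortcuts import render

def landing_page(request):
    # make sure anonymous visitors get a session with a get_user key
    # before the websocket connects
    if not request.user.is_authenticated and 'get_user' not in request.session:
        request.session['get_user'] = "AnonymousUser" + str(uuid.uuid4())
    return render(request, 'landing.html')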
I need to scrape the data of each item from a website using Scrapy (http://example.com/itemview). I have a list of itemIDs and I need to pass each one to a form on example.com.
There is no URL change for each item, so for each request in my spider the URL will always be the same, but the content will be different.
I don't want a for loop for handling each request, so I followed the steps below:
started spider with the above url
added item_scraped and spider_closed signals
passed through several functions
passed the scraped data to pipeline
triggered the item_scraped signal
After this it automatically calls the spider_closed signal. But I want the above steps to continue until all the itemIDs are finished.
import scrapy
from scrapy import Request, signals
from scrapy.xlib.pydispatch import dispatcher
from selenium import webdriver

# ExamplecrawlerItem is imported from the project's items.py (omitted here)

class ExampleSpider(scrapy.Spider):
    name = "example"
    allowed_domains = ["example.com"]
    itemIDs = [11111, 22222, 33333]
    current_item_num = 0

    def __init__(self, itemids=None, *args, **kwargs):
        super(ExampleSpider, self).__init__(*args, **kwargs)
        dispatcher.connect(self.item_scraped, signals.item_scraped)
        dispatcher.connect(self.spider_closed, signals.spider_closed)

    def spider_closed(self, spider):
        self.driver.quit()

    def start_requests(self):
        request = self.make_requests_from_url('http://example.com/itemview')
        yield request

    def parse(self, response):
        self.driver = webdriver.PhantomJS()
        self.driver.get(response.url)
        first_data = self.driver.find_element_by_xpath('//div[@id="itemview"]').text.strip()
        yield Request(response.url, meta={'first_data': first_data}, callback=self.processDetails, dont_filter=True)

    def processDetails(self, response):
        itemID = self.itemIDs[self.current_item_num]
        # ...form submission with the current itemID goes here...
        # ...the content of the page is updated with the given itemID...
        yield Request(response.url, meta={'first_data': response.meta['first_data']}, callback=self.processData, dont_filter=True)

    def processData(self, response):
        # ...some more scraping goes here...
        item = ExamplecrawlerItem()
        item['first_data'] = response.meta['first_data']
        yield item

    def item_scraped(self, item, response, spider):
        self.current_item_num += 1
        # I need to call the processDetails function here for the next itemID
        # and the process needs to continue until the itemIDs finish
        self.parse(response)
My pipeline:
class ExampleDBPipeline(object):
    def process_item(self, item, spider):
        MYCOLLECTION.insert(dict(item))
        return item
I wish I had an elegant solution to this. But instead it's a hackish way of calling the underlying classes.
self.crawler.engine.slot.scheduler.enqueue_request(scrapy.Request(url,self.yourCallBack))
However, you can yield a request after you yield the item and have its callback point to self.processDetails. Simply add this to your processData function:
yield item
self.counter += 1
yield scrapy.Request(response.url, callback=self.processDetails, dont_filter=True, meta={"your": "Dictionary"})
Also, PhantomJS can be nice and make your life easy, but it is slower than regular connections. If possible, find the request for json data or whatever makes the page unparseable without JS. To do so, open up chrome, right click, click inspect, go to the network tab, then enter the ID into the form, then look at the XHR or JS tabs for a JSON that has the data or next url you want. Most of the time, there will be some url made by adding the ID, if you can find it, you can just concatenate your urls and call that directly without having the cost of JS rendering. Sometimes it is randomized, or not there, but I've had fair success with it. You can then also use that to yield many requests at the same time without having to worry about phantomJS trying to do two things at once or having to initialize many instances of it. You could use tabs, but that is a pain.
Also, I would use a Queue of your IDs to ensure thread safety. Otherwise, you could have processDetails called twice on the same ID. In the current logic of your program everything runs linearly anyway, which means you aren't using the concurrency capabilities of Scrapy and your program will run more slowly. To use a Queue, add:
import Queue

# inside the class definition, add
itemIDQueue = Queue.Queue()

# within __init__, add
[self.itemIDQueue.put(ID) for ID in self.itemIDs]

# within processDetails, replace "itemID = self.itemIDs[self.current_item_num]" with
itemID = self.itemIDQueue.get()
And then there is no need to increment the counter and your program is thread safe.
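Putting the queue and the follow-up request together, processData could end up looking something like this (a sketch based on the snippets above; it stops yielding new requests once the queue is empty):

def processData(self, response):
    # ...some more scraping goes here...
    item = ExamplecrawlerItem()
    item['first_data'] = response.meta['first_data']
    yield item

    # keep the cycle going until every itemID in the queue has been handled
    if not self.itemIDQueue.empty():
        yield scrapy.Request(response.url, callback=self.processDetails,
                             dont_filter=True,
                             meta={'first_data': response.meta['first_data']})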
I'm trying to get a unique value for a field (unique within the db column).
my code (other model fields omitted):
class PlatformUserChildren(models.Model):
    dashboard = models.CharField('dashboard URL', max_length=64, unique=True, default=createDashboardCode(self))

    def createDashboardCode(self):
        stringCheck = False
        while stringCheck is False:
            newString = str(uuid.uuid4())[:32]
            doesStringExist = newString in self.dashboard
            if not doesStringExist:
                stringCheck = True
        return newString
I'm getting name 'self' is not defined as an error.
What should I be passing to the function so that I can check the db column to ensure the value is unique -or- is there a built-in way of doing this?
What I've already tried or looked at:
setting unique=True for the field and using default=uuid.uuid4 - that gives me duplicate values and generates a validation error (SO link)
I'm aware of Django 1.8's UUIDField; however, I'm on 1.7.
The problem lies in the following line (indented for better readability) as you already know and mentioned before:
dashboard = models.CharField(
    'dashboard URL',
    max_length=64,
    unique=True,
    default=createDashboardCode(self)
)
In this part:
default=createDashboardCode(self)
you're calling the method createDashboardCode with the argument self. This is wrong, because you never pass self to a method explicitly; Python passes it for you. Whenever you call the method createDashboardCode you should do it this way:
createDashboardCode()
That's it, you're not passing the argument self explicitly.
You're getting an error "name 'self' is not defined" because self is not defined. There is no variable self in your code that you can pass to the method.
Now we're one step further, but your problem won't be solved if you just apply this slight change to your code.
The return value from the method createDashboardCode will be assigned to default. That's not what you really want. You have to assign a reference of the method to default:
default = createDashboardCode
Pay attention to the missing brackets. This will call the method every time a new instance of the model is created.
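For illustration, one common way to wire this up (a sketch, not the asker's exact code) is to move the generator out to a module-level function so it needs no self, let it check the column for collisions, and pass the function itself as the default:

import uuid

from django.db import models

def create_dashboard_code():
    # keep generating until the value does not already exist in the column
    while True:
        new_string = str(uuid.uuid4())[:32]
        if not PlatformUserChildren.objects.filter(dashboard=new_string).exists():
            return new_string

class PlatformUserChildren(models.Model):
    # no brackets: the callable runs each time a new instance needs a default
    dashboard = models.CharField('dashboard URL', max_length=64, unique=True,
                                 default=create_dashboard_code)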
Define a function:
def my_function():
    print "hello"
and run it in the Python interpreter:
# once like this
my_function()
# and again like this
my_function
and you'll see the difference. That should help you to better comprehend this issue.