Hi, we are using the DropNet API and it worked fine until today; now only the UploadFile method is failing. Playing media, moving, and deleting files all still work fine, but the upload does not. Can someone help us figure out why?
Basically, the upload call returns without any error, but the file (an .mp3) is never uploaded to my Dropbox account, even though it was working until last Friday. We tried moving and deleting a file, and playing an mp3 from my web page, and all of that works perfectly. How can we try to catch the error?
We use this code:
var fecha = DateTime.Now.ToString("dd-MM-yyyy");
DropNet.Models.MetaData uploaded = new DropNet.Models.MetaData();
uploaded = _client.UploadFile("/CRM_Proximitas/" + portal + "/" + Request.QueryString["idReclamo"].ToString() + "/" + fecha, hidTipo.Value.ToString() + hidID.Value.ToString() + "_" + fecha + "_" + usuario + "." + uplArchivo.FileName.ToLower().Split('.')[1], uplArchivo.FileBytes);
if (uploaded != null)
{
    if (hidTipo.Value.ToString() == "R")
    {
        objCommand.CommandText = "INSERT INTO ReclamoAudio (idReclamo,path,Autor,Fecha) VALUES (" + Request.QueryString["idReclamo"].ToString() + ",'" + uploaded.Path.ToString() + "'," + Session["UserID"].ToString() + ",getdate())";
    }
    else if (hidTipo.Value.ToString() == "T")
    {
        objCommand.CommandText = "INSERT INTO ReclamoAudio (idReclamo,idTransaccion,path,Autor,Fecha) VALUES (" + Request.QueryString["idReclamo"].ToString() + "," + hidID.Value.ToString() + ",'" + uploaded.Path.ToString() + "'," + Session["UserID"].ToString() + ",getdate())";
    }
    objCommand.CommandType = CommandType.Text;
    objCon.Open();
    drContacto = objCommand.ExecuteReader();
    objCon.Close();
    string codigo = "window.opener.location.reload();self.close();";
    Page.ClientScript.RegisterStartupScript(this.GetType(), "FinalizarCarga", codigo, true);
}
We have a website on IIS 7 with .NET Framework 2.0, and the Dropbox app is now in Development status. Could that be causing the issue?
Please, I need your help.
Thanks a lot!
Is there a way to export multiple SQL tables as CSV by issuing specific queries from Cloud SQL?
Below is the code I currently have. When I call exportTables for multiple tables back to back, I see a 409 error. It's probably because the Cloud SQL instance is busy with one export and does not allow a subsequent export request.
How can I get this to work? What would be the ideal solution here?
private void exportTables(String table_name, String query)
    throws IOException, InterruptedException {
  HttpClient httpclient = new HttpClient();
  PostMethod httppost =
      new PostMethod(
          "https://www.googleapis.com/sql/v1beta4/projects/"
              + "abc"
              + "/instances/"
              + "zxy"
              + "/export");
  String destination_bucket =
      String.join(
          "/",
          "gs://" + "test",
          table_name,
          DateTimeUtil.getCurrentDate() + ".csv");
  GoogleCredentials credentials =
      GoogleCredentials.getApplicationDefault().createScoped(SQLAdminScopes.all());
  AccessToken access_token = credentials.refreshAccessToken();
  access_token.getTokenValue();
  httppost.addRequestHeader("Content-Type", "application/json");
  httppost.addRequestHeader("Authorization", "Bearer " + access_token.getTokenValue());
  String request =
      "{"
          + " \"exportContext\": {"
          + " \"fileType\": \"CSV\","
          + " \"uri\":\""
          + destination_bucket
          + "\","
          + " \"databases\": [\""
          + "xyz"
          + "\"],"
          + " \"csvExportOptions\": {"
          + " \"selectQuery\": \""
          + query
          + "\""
          + " }\n"
          + " }"
          + "}";
  httppost.setRequestEntity(new StringRequestEntity(request, "application/json", "UTF-8"));
  httpclient.executeMethod(httppost);
  if (httppost.getStatusCode() > 200) {
    String response = new String(httppost.getResponseBody(), StandardCharsets.UTF_8);
    if (httppost.getStatusCode() != 409) {
      throw new RuntimeException(
          "Exception occurred while exporting the table: " + table_name + " Error " + response);
    } else {
      throw new IOException("SQL instance seems to be busy at the moment. Please retry");
    }
  }
  httppost.releaseConnection();
  logger.info("Finished exporting table {} to {}", table_name, destination_bucket);
}
I don't have a suggestion for fixing the issue on Cloud SQL directly, but here is a way to run the exports in sequence thanks to a fairly new tool: Workflows.
Define the data format that you want, in JSON, to describe ONE export.
Then provide an array of these configurations to your workflow.
In this workflow:
Loop over the configuration array.
For each configuration, perform an API call to Cloud SQL to start the export.
Read the answer of the API call; it gives you the jobId.
Sleep for a while.
Check whether the export is finished (using the jobId).
If not, sleep and check again.
If it is, continue the loop (and thus start the next export).
A rough code sketch of this start-then-poll pattern is shown below.
It's serverless and the free tier makes this use case free.
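If you would rather keep the sequencing in application code instead of a workflow, the same start-then-poll pattern looks roughly like the sketch below. This is only a sketch in JavaScript (Node.js); it assumes google-auth-library with Application Default Credentials, reuses the placeholder project/instance/database names from the question, and the operations endpoint and the name/status fields are my reading of the SQL Admin v1beta4 API, so double-check them against the documentation.

// Run the exports one at a time: start an export, then poll its operation
// until it reports DONE before starting the next export.
const { GoogleAuth } = require('google-auth-library');

const BASE = 'https://www.googleapis.com/sql/v1beta4/projects/abc'; // "abc" = project placeholder from the question
const INSTANCE = 'zxy';                                             // instance placeholder from the question

async function exportSequentially(configs) {
  const auth = new GoogleAuth({ scopes: ['https://www.googleapis.com/auth/sqladmin'] });
  const client = await auth.getClient();

  for (const cfg of configs) {
    // Start one export (same request body shape as in the question).
    const start = await client.request({
      url: BASE + '/instances/' + INSTANCE + '/export',
      method: 'POST',
      data: {
        exportContext: {
          fileType: 'CSV',
          uri: cfg.uri,                      // e.g. gs://test/<table>/<date>.csv
          databases: [cfg.database],         // e.g. 'xyz'
          csvExportOptions: { selectQuery: cfg.query },
        },
      },
    });

    // Poll the returned operation (the "jobId") until it is DONE.
    let operation = start.data;
    while (operation.status !== 'DONE') {
      await new Promise((resolve) => setTimeout(resolve, 10000)); // sleep a while
      const check = await client.request({ url: BASE + '/operations/' + operation.name });
      operation = check.data;
    }
  }
}

You would call it with an array such as [{ database: 'xyz', uri: 'gs://test/table1/2021-01-01.csv', query: 'SELECT * FROM table1' }, ...], one entry per export.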
I have developed a web service in C# (.NET) that connects to SharePoint Online, and I have hosted that web service on one of the SharePoint on-premises IIS front-end servers, but I see some strange behavior. If we call that web service for the first time in a day from a Nintex workflow, we get the error below:
The GetListAttachmentsAndCopyToSPOnline workflow has ended unexpectedly for (no title).
For more information, please read this article: http://go.microsoft.com/fwlink/?LinkID=323543&clcid=0x409
Click here to view the workflow status.
But once we restart that particular web service from IIS, everything starts working perfectly. In other words, to use the web service we have to restart it from IIS at least once every day. I have no clue why it behaves this way. Any help would be much appreciated.
Sample web service code:
using (ClientContext ctx = new ClientContext(siteURL))
{
    bool isSONumberFound = false;
    string userName = "MyUserName";
    string passWord = "MyPassword";
    if (salesOrderNumber.Length <= 7)
    {
        salesOrderNumber = "000" + salesOrderNumber;
    }
    using (SecureString securedPassWord = new SecureString())
    {
        foreach (char c in passWord.ToCharArray()) securedPassWord.AppendChar(c);
        ctx.Credentials = new SharePointOnlineCredentials(userName, securedPassWord);
        Web web = ctx.Web;
        try
        {
            ctx.Load(web);
            ctx.ExecuteQuery();
            List list = web.Lists.GetByTitle(documentLibrayName);
            var camlQuery = new CamlQuery() { ViewXml = "<View><Query><Where><Eq><FieldRef Name='OrderID' /><Value Type='Text'>" + salesOrderNumber + "</Value></Eq></Where></Query></View>" };
            var queryResults = list.GetItems(camlQuery);
            ctx.Load(queryResults);
            ctx.ExecuteQuery();
            int countSalesOrder = queryResults.Count;
            if (countSalesOrder >= 1)
            {
                isSONumberFound = true;
                using (StreamWriter w = System.IO.File.AppendText(logFileName))
                {
                    string logMessageSOFound = "Sales Order Number: " + salesOrderNumber + " is found in the target document library: " + documentLibrayName + " of " + siteURL + ", so please enter a valid SO number";
                    Log(logMessageSOFound, w);
                    w.Flush();
                }
            }
            else
            {
                isSONumberFound = false;
                using (StreamWriter w = System.IO.File.AppendText(logFileName))
                {
                    string logMessageSONotFound = "Sales Order Number: " + salesOrderNumber + " is not found in the target document library: " + documentLibrayName + " of " + siteURL + ", so please enter a valid SO number";
                    Log(logMessageSONotFound, w);
                    w.Flush();
                }
            }
        }
        catch (Exception ex)
        {
            using (StreamWriter w = System.IO.File.AppendText(logFileName))
            {
                Log(ex.Message, w);
            }
        }
    }
    return isSONumberFound;
}
function sendText(id, text) {
  if (text == "hiii") {
    var url = telegramUrl + "/sendMessage?chat_id=" + id + "&text=" + "sup?";
  } else {
    var url = telegramUrl + "/sendMessage?chat_id=" + id + "&text=" + "yo yo yo";
    var response = UrlFetchApp.fetch(url);
    Logger.log(response.getContentText());
  }
}
My issue is that in Google Apps Script (the back end to Google Sheets), I have this function that reads in a message from Telegram and, if the message reads "hiii", it should respond "sup?". Currently my code does not do this and instead only executes the else branch.
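For what it's worth, the UrlFetchApp.fetch call only runs inside the else branch, so even when text equals "hiii" the url is built but never sent. A minimal sketch of one possible fix (assuming telegramUrl is defined elsewhere in the script, and trimming the incoming text in case it carries stray whitespace):

function sendText(id, text) {
  // Choose the reply first, then send it in one place so both cases actually call fetch.
  var reply = (String(text).trim() === "hiii") ? "sup?" : "yo yo yo";
  var url = telegramUrl + "/sendMessage?chat_id=" + id + "&text=" + encodeURIComponent(reply);
  var response = UrlFetchApp.fetch(url);
  Logger.log(response.getContentText());
}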
Using the standalone version of Postman, how do I output some debug information in a Postman pre-request script?
The code below works great in tests (post-request scripts), but not in pre-request scripts, since there is no tests[] array there.
var jsonData = JSON.parse(responseBody);
tests["key = " + jsonData.key] = true; // debug message
tests["value = " + jsonData.value] = true; // debug message
The only way I could find to accomplish this is by using:
console.log("key = " + key);
console.log("value = " + value);
And then open the Postman Console (Cmd + Option + C / Ctrl + Alt + C) to view the debug logs in a separate window.
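For example, a pre-request script can only log values that exist before the request is sent, such as variables or the request itself. A small sketch (the "token" environment variable name here is just a placeholder):

// Pre-request script
console.log("url = " + pm.request.url.toString());
console.log("token = " + pm.environment.get("token")); // placeholder variable name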
I'm trying to get my friends' relationship statuses here. I granted all the right permissions, but it still gives me undefined; it doesn't extract the relationship status of a friend, nor the birthday. Here is my code:
function loadFriendsrel() {
  // get array of friends
  FB.api('/me/friends?fields=name,first_name,gender,picture,relationship_status,birthday', function(response) {
    console.log(response);
    var divContainer = $('.facebook-friends');
    var testdiv2 = document.getElementById("test2");
    for (var i = 0; i < response.data.length; i++) {
      if (response.data[i].gender == 'female') {
        testdiv2.innerHTML += response.data[i].first_name + '<br/>' + response.data[i].relationship_status + '<br/>' + ' ' + '<img src="' + response.data[i].picture + '"/>' + '<br /> <br/>';
      }
    }
  });
}
Even if you get all the permissions, you won't get the relationship_status of users who have hidden it through their privacy settings.
Privacy settings have a higher priority than the Facebook APIs.
So, in your loop, some friends may have hidden their relationship_status, which is why it comes back as undefined.
Change your loop to something like this:
for (var i = 0; i < response.data.length; i++) {
  if (response.data[i].gender == 'female') {
    var relStatus = 'Relationship status not provided';
    // If relationship_status exists, only then take its value
    if ('relationship_status' in response.data[i]) {
      relStatus = response.data[i].relationship_status;
    }
    testdiv2.innerHTML += response.data[i].first_name + '<br/>' + relStatus + '<br/>' + ' ' + '<img src="' + response.data[i].picture + '"/>' + '<br /> <br/>';
  }
}
You can apply similar logic to other fields too.
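For example, a sketch of the same guard applied to the birthday field, which is likewise hidden by many users' privacy settings:

for (var i = 0; i < response.data.length; i++) {
  var friend = response.data[i];
  var birthday = 'Birthday not provided';
  if ('birthday' in friend) {
    birthday = friend.birthday;
  }
  // ...then build your HTML with friend.first_name, relStatus, birthday, and so on.
}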