Titanium Alloy Caching in Android/iOS? Or Preserving old views - web-services

Can we cache dynamically created lists or views until the web services are called in the background? I want to achieve something like the Facebook app does. I know it's possible in Android core, but I wanted to try it in Titanium (Android and iOS).
To explain further:
Consider an app that shows a list. When I open it for the first time, it obviously hits the web service and builds the list dynamically.
Now I close the app and open it again. The old list should stay visible until the web service returns new data.

Yes, Titanium can do this. You should use a global variable like Ti.App.myList if it is just an array / a list / a simple value. If you need to store more complex data like images or databases, you should use the built-in filesystem. There is really good documentation on the Appcelerator website.
The procedure for you would be as follows:
Load your data for the first time
Store your data in your preferred way (Global variable, file system)
On future app starts, read out your local list / data and display it until your sync is successful.
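For a simple list, a minimal sketch of steps 2 and 3 could look like this (Ti.App.Properties persists across app restarts; renderList and fetchListFromWebservice are hypothetical placeholders for your own UI and web-service code):
// On app start: show whatever was cached last time (empty array on the first run)
var cachedList = Ti.App.Properties.getList("cachedList", []);
renderList(cachedList); // build your ListView / TableView from the cached data

// Then refresh from the webservice in the background
fetchListFromWebservice(function(freshList) {
    Ti.App.Properties.setList("cachedList", freshList); // overwrite the cache
    renderList(freshList); // update the UI once fresh data arrives
});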
You should also consider implementing some variable to check whether any update is needed at all, to minimize network use (it saves energy and provides a better user experience if the user's internet connection is slow). For more complex data such as images, the following snippet caches the downloaded files in the filesystem; it belongs in the success callback of your API call:
if (response.state == "SUCCESS") {
    Ti.API.info("Themes successfully checked");
    Ti.API.info("RESPONSE TEST: " + response.value);
    // Create a map of the layout names (as keys) and the corresponding URLs (as values).
    var newImageMap = {};
    for (var key in response.value) {
        var url = response.value[key];
        var filename = key + ".jpg"; // EDIT: use your image type
        newImageMap[filename] = url;
    }
    if (Ti.App.imageMap && Object.keys(Ti.App.imageMap).length > 0) {
        // Check for removed layouts
        for (var image in Ti.App.imageMap) {
            if (image in newImageMap) {
                Ti.API.info("The image " + image + " is already in the local map");
                // Do nothing
            } else {
                // Delete the removed layout
                Ti.API.info("The image " + image + " is deleted from the local map");
                delete Ti.App.imageMap[image];
            }
        }
        // Check for new images
        for (var image in newImageMap) {
            if (image in Ti.App.imageMap) {
                Ti.API.info("The image " + image + " is already in the local map");
                // Do nothing
            } else {
                Ti.API.info("The image " + image + " is put into the local map");
                // Put the new image into the local map
                Ti.App.imageMap[image] = newImageMap[image];
            }
        }
    } else {
        Ti.App.imageMap = newImageMap;
    }
    // Check whether each file already exists
    for (var key in response.value) {
        var url = response.value[key];
        var filename = key + ".png"; // EDIT: use your file type
        Ti.API.info("URL: " + url);
        Ti.API.info("FILENAME: " + filename);
        imagesOrder[imagesOrder.length] = filename.match(/\d+/)[0]; // saves the first number in the filename as an ID
        // Case 1: download a new image
        // NOTE: resourcesDirectory is read-only on device; use Ti.Filesystem.applicationDataDirectory for files you write at runtime
        var file = Ti.Filesystem.getFile(Ti.Filesystem.resourcesDirectory, "/media/" + filename);
        if (file.exists()) {
            // Do nothing
            Ti.API.info("File " + filename + " exists");
        } else {
            // Wrap the download in a closure so each async callback keeps its own url/file/filename
            (function(url, file, filename) {
                // Create the HTTP client to download the asset.
                var xhr = Ti.Network.createHTTPClient();
                xhr.onload = function() {
                    if (xhr.status == 200) {
                        // On successful load, save the remote image data to the file created above.
                        Ti.API.info("Successfully loaded " + filename);
                        file.write(xhr.responseData);
                        Ti.API.info(file);
                        Ti.API.info(file.getName());
                    }
                };
                // Issue a GET request to the remote URL
                xhr.open('GET', url);
                // Finally, send the request.
                xhr.send();
            })(url, file, filename);
        }
    }
}
In addition to this code, which should be placed in the success callback of an API call, you need a global variable Ti.App.imageMap to store the map of keys and the corresponding URLs. You will probably have to change the code a bit to fit your needs and your project, but it should give you a good starting point.

Related

Google StorageClient.ListObjects causes thread cancel when deployed to server

I have some code that fetches files from Google Cloud Storage. The code works fine on my local development machine, but when deployed to our production server it always "stops" whenever the variable objectsInBucket (in the code below) is used.
In the example below, the last line that actually executes correctly when deployed to the server is LogHelper.LogToConsole(" - 6").
If I uncomment the foreach loop, the last output to the console is "- 4". Also, if I for example add a variable like var count = objectsInBucket.Count(); immediately after objectsInBucket = storageClient.ListObjects(_gcsBucketName);, then the last output is "- 1".
But as I said, this problem only occurs when deployed to the server. What can the cause of this be?
Google.Api.Gax.PagedEnumerable<Google.Apis.Storage.v1.Data.Objects, Google.Apis.Storage.v1.Data.Object> objectsInBucket = null;
LogHelper.LogToConsole($" - 1");
objectsInBucket = storageClient.ListObjects(_gcsBucketName);
LogHelper.LogToConsole($" - 2");
//var count = objectsInBucket.Count(); // this causes last output to be "- 2"
var dirPath = Path.Combine(_gcsAttachemntPath, attachmentId);
LogHelper.LogToConsole($" - 3");
if (objectsInBucket != null)
{
    LogHelper.LogToConsole($" - 4");
    //LogHelper.LogToConsole($" - {objectsInBucket.Count()} attachments exists on bucket:");
    //foreach (var obj in objectsInBucket)
    //{
    //    LogHelper.LogToConsole($" - - {obj.Name}");
    //}
    LogHelper.LogToConsole($" - 5");
    var directoryInfo = new DirectoryInfo(dirPath);
    if (directoryInfo.Exists)
    {
        LogHelper.LogToConsole($" - Deleting directory: {dirPath}");
        directoryInfo.Delete(true);
    }
    directoryInfo.Create();
    LogHelper.LogToConsole($" - Directory created: {dirPath}");
}
LogHelper.LogToConsole($" - 6");
var attachmentFiles = objectsInBucket.Where(x => x.Name.Contains(attachmentId));
From the code you shared, it seems you want to achieve this: if an object exists in the folder, you want to delete the object and create a subfolder in the same folder; if it does not exist, create the subfolder directly in the folder. I am not very experienced in C#, but the code below may help you.
StorageClient storageClient = StorageClient.Create();
string bucketName = "bucketName";
string folderName = "folderToCreate";
string objectName = "ObjectToCheckToExist";
// GetObject throws a 404 GoogleApiException when the object is missing, so probe it inside a try/catch
bool objectExists = true;
try {
    storageClient.GetObject(bucketName, $"{folderName}/{objectName}");
} catch (Google.GoogleApiException e) when (e.HttpStatusCode == System.Net.HttpStatusCode.NotFound) {
    objectExists = false;
}
if (objectExists) {
    storageClient.DeleteObject(bucketName, $"{folderName}/{objectName}");
    // To remove the "folder" itself, you would also delete the "{folderName}/" placeholder object, if one exists
} else {
    // GCS has no real folders; a zero-byte object whose name ends in "/" acts as one
    storageClient.UploadObject(bucketName, $"{folderName}/", "application/octet-stream", new System.IO.MemoryStream());
}
Have a look at the official documentation as well.
The answers were much appreciated.
It turns out that, for some reason, this call was blocked by the firewall on the server.
We were given a proxy to use when making these calls.
With the proxy in place, everything worked fine.
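For reference, one way such a proxy might be wired in (a sketch, assuming .NET Core 3.0 or later and a placeholder proxy address) is to set the process-wide default proxy before the StorageClient is created:
using System.Net;
using System.Net.Http;
using Google.Cloud.Storage.V1;

// Route outgoing HTTP(S) calls through the corporate proxy (placeholder address)
HttpClient.DefaultProxy = new WebProxy("http://proxy.example.com:8080");

// Clients created afterwards should pick up the default proxy through their HttpClientHandler
StorageClient storageClient = StorageClient.Create();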

PowerBi Api - How to get GroupId and DatasetId of Dashboard via API

I have been reading https://powerbi.microsoft.com/en-us/blog/announcing-data-refresh-apis-in-the-power-bi-service/
In this post, it mentions "To get the group ID and dataset ID, you can make a separate API call".
Does anybody know how to do this from the dashboard URL, or do I have to embed the group id and dataset id in my app alongside the dashboard URL?
To get the group ID and dataset ID, you can make a separate API call.
This sentence isn't related to a dashboard, because one dashboard can contain visuals showing data from many different datasets. These separate API calls are Get Groups (to get the list of groups, find the one you want and read its id) and Get Datasets In Group (to find the dataset you are looking for and read its id).
But you should already know the groupId anyway, because the dashboard is in the same group.
Eventually, you can get the datasetId of a particular tile using Get Tiles In Group, but I do not know a way to list the tiles in a dashboard using the REST API.
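A minimal sketch of those two calls (assuming you already have an Azure AD access token in accessToken; the group and dataset ids are then read from the returned JSON):
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class PowerBiIdLookup
{
    static async Task Main()
    {
        string accessToken = "<your AAD access token>"; // placeholder
        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);

            // 1) Get Groups - pick the workspace you want and note its id
            string groups = await client.GetStringAsync("https://api.powerbi.com/v1.0/myorg/groups");
            Console.WriteLine(groups);

            // 2) Get Datasets In Group - pick the dataset you want and note its id
            string groupId = "<group id from the previous call>"; // placeholder
            string datasets = await client.GetStringAsync("https://api.powerbi.com/v1.0/myorg/groups/" + groupId + "/datasets");
            Console.WriteLine(datasets);
        }
    }
}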
This is C# project code to get the dataset id from Power BI.
Use the method below to call the 'Get' API and fetch the dataset Id.
public void GetDatasetDetails()
{
    HttpResponseMessage response = null;
    HttpContent responseContent = null;
    string strContent = "";
    PowerBIDataset ds = null;
    string serviceURL = "https://api.powerbi.com/v1.0/myorg/admin/datasets";

    Console.WriteLine("");
    Console.WriteLine("- Retrieving data from: " + serviceURL);
    response = client.GetAsync(serviceURL).Result;
    Console.WriteLine(" - Response code received: " + response.StatusCode);

    try
    {
        responseContent = response.Content;
        strContent = responseContent.ReadAsStringAsync().Result;
        if (strContent.Length > 0)
        {
            Console.WriteLine(" - De-serializing DataSet details...");
            // Parse the JSON string into objects
            JavaScriptSerializer js = new JavaScriptSerializer();
            js.MaxJsonLength = 2147483647; // Set the maximum json document size to the max
            ds = js.Deserialize<PowerBIDataset>(strContent);
            if (ds != null)
            {
                if (ds.value != null)
                {
                    foreach (PowerBIDatasetValue item in ds.value)
                    {
                        string datasetID = "";
                        string datasetName = "";
                        string datasetWeburl = "";
                        if (item.id != null)
                        {
                            datasetID = item.id;
                        }
                        if (item.name != null)
                        {
                            datasetName = item.name;
                        }
                        if (item.qnaEmbedURL != null)
                        {
                            datasetWeburl = item.qnaEmbedURL;
                        }
                        // Output the dataset data
                        Console.WriteLine("");
                        Console.WriteLine("----------------------------------------------------------------------------------");
                        Console.WriteLine("");
                        Console.WriteLine("Dataset ID: " + datasetID);
                        Console.WriteLine("Dataset Name: " + datasetName);
                        Console.WriteLine("Dataset Web Url: " + datasetWeburl);
                    } // foreach
                } // ds.value
            } // ds
        }
        else
        {
            Console.WriteLine(" - No content received.");
        }
    }
    catch (Exception ex)
    {
        Console.WriteLine(" - API Access Error: " + ex.ToString());
    }
}
Points to remember:
Make sure these classes exist in your project:
PowerBIDataset is a class with a List<PowerBIDatasetValue> property called value
PowerBIDatasetValue is a class with id, name and webUrl (all string data type) data members; the code above also reads a qnaEmbedURL field (a rough sketch of both classes follows at the end of this answer)
Provide the constants below in your project class:
const string ApplicationID = "747d78cd-xxxx-xxxx-xxxx-xxxx";
// Native Azure AD App ClientID -- Put your Client ID here
const string UserName = "user2@xxxxxxxxxxxx.onmicrosoft.com";
// Put your Active Directory / Power BI Username here (note this is not a secure place to store this!)
const string Password = "xyxxyx";
// Put your Active Directory / Power BI Password here (note this is not a secure place to store this! this is a sample only)
Call this GetDatasetDetails() method in the Main method of your project class.
Finally, use the 'Get' API below to get the Group Id:
https://api.powerbi.com/v1.0/myorg/groups
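For completeness, a rough sketch of those two classes, based only on the fields the code above reads (the property names are assumptions matching that code and the admin/datasets JSON):
using System.Collections.Generic;

public class PowerBIDataset
{
    // "value" holds the array of datasets returned by the admin/datasets endpoint
    public List<PowerBIDatasetValue> value { get; set; }
}

public class PowerBIDatasetValue
{
    public string id { get; set; }
    public string name { get; set; }
    public string webUrl { get; set; }
    public string qnaEmbedURL { get; set; }
}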

Aspose.Barcode cannot read DecodeType.Code128 barcode

The Aspose.BarCode reader is unable to read a barcode of type DecodeType.Code128.
Workflow steps:
1. Using Aspose.BarCode we create a barcode of type DecodeType.Code128 and put it on a PDF page (our clients use this page as a separator sheet).
2. Our client then inserts this barcode page between several physical documents and scans them all, which creates one big single PDF.
3. Our splitting process then loops through all pages, checks whether a page is a barcode page, and splits the big PDF into individual small PDFs.
The issue is that sometimes the scanned quality of the barcode is not that great, and in such cases Aspose.BarCode is unable to read the barcode.
I have attached a couple of barcode PDFs with low scan quality that Aspose is not able to read. I have tried different combinations of RecognitionMode and ManualHints options without any luck.
Below is my code to identify the barcode page:
using (var fs = new FileStream(file, FileMode.Open))
{
    var pdfDocument = new Document(fs);
    foreach (Page page in pdfDocument.Pages)
    {
        var isSeparator = splitter.IsSeparator(page);
        Assert.IsTrue(isSeparator);
    }
}

public bool IsSeparator(Page page)
{
    if (page.Resources.Images != null && page.Resources.Images.Count >= 1)
    {
        var img = page.Resources.Images[1];
        using (MemoryStream barcodeImage = new MemoryStream())
        {
            img.Save(barcodeImage, ImageFormat.Jpeg);
            barcodeImage.Seek(0L, SeekOrigin.Begin);
            using (BarCodeReader barcodeReader = new BarCodeReader(barcodeImage, _barcodeDecodeType))
            {
                barcodeReader.RecognitionMode = RecognitionMode.MaxQuality;
                while (barcodeReader.Read())
                {
                    var barcodeText = barcodeReader.GetCodeText();
                    if (barcodeText.ToLower() == "eof")
                    {
                        return true;
                    }
                }
            }
        }
    }
    return false;
}
I was unable to reproduce the issue at my end. I used the following sample code snippet to recognize the barcode along with the latest version of the API. It is always recommended to use the latest version of the API, as it contains new features and improvements.
CODE:
Aspose.Pdf.License licensePdf = new Aspose.Pdf.License();
licensePdf.SetLicense(@"Aspose.Total.lic");
// bind the pdf document
Aspose.Pdf.Facades.PdfExtractor pdfExtractor = new Aspose.Pdf.Facades.PdfExtractor();
pdfExtractor.BindPdf(@"173483_2.pdf");
// extract the images
pdfExtractor.ExtractImage();
// save images to stream in a loop
while (pdfExtractor.HasNextImage())
{
    // save image to stream
    System.IO.MemoryStream imageStream = new System.IO.MemoryStream();
    pdfExtractor.GetNextImage(imageStream);
    imageStream.Position = 0;
    Aspose.BarCode.BarCodeRecognition.BarCodeReader barcodeReader =
        new Aspose.BarCode.BarCodeRecognition.BarCodeReader(imageStream);
    while (barcodeReader.Read())
    {
        Console.WriteLine("Codetext found: " + barcodeReader.GetCodeText() + ", Symbology: " + barcodeReader.GetCodeType().ToString());
    }
    // close the reader
    barcodeReader.Close();
}
Further to update you, the same query has been posted on the Aspose.BarCode support forum. Please visit the link for details.
I work as a developer evangelist at Aspose.

How to get the large picture from feed with graph api?

When loading the Facebook feed of a page, if a picture exists in a feed item, I want to display the large picture.
How can I get it with the Graph API? The picture link in the feed is not the large one.
Thanks.
The Graph API photo object has a picture connection (similar to the one the user object has):
“The album-sized view of the photo. […] Returns: HTTP 302 redirect to the URL of the picture.”
So requesting https://graph.facebook.com/{object-id-from-feed}/picture will redirect you immediately to the album-sized version of the photo. (Useful not only for displaying it in a browser, but also if, for example, you want to download the image to your server using cURL with the follow-redirect option set.)
Edit:
Beginning with API v2.3, the /picture edge for feed posts is deprecated.
However, picture can still be requested as a field, but it will be a small one.
full_picture is available as well.
So /{object-id-from-feed}?fields=picture,full_picture can be used to request those, or they can be requested directly with the rest of the feed data, like this: /page-id/feed?fields=picture,full_picture,… (additional fields, such as message etc., must be specified the same way).
What worked for me:
getting the picture link from the feed and replacing "_s.jpg" with "_n.jpg"
OK, I found a better way. When you retrieve a feed with the graph API, any feed item with a type of photo will have a field called object_id, which is not there for plain status type items. Query the Graph API with that ID, e.g. https://graph.facebook.com/1234567890. Note that the object ID isn't an underscore-separated value like the main ID of that feed item is.
The result of the object_id query will be a new JSON dictionary, where you will have a source attribute containing a URL for an image that has so far been big enough for my needs.
There is additionally an images array that contains more image URLs for different sizes of the image, but the sizes there don't seem to be predictable, and don't all actually correspond to the physical dimensions of the image behind that URL.
I still wish there was a way to do this with a single Graph API call, but it doesn't look like there is one.
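If you are using the Facebook JavaScript SDK, a minimal sketch of that extra call could look like this (post is a feed item that has an object_id, and showImage is a hypothetical display function):
// Query the photo object by its object_id and read the full-size "source" URL
FB.api('/' + post.object_id, function (response) {
    if (response && response.source) {
        showImage(response.source); // e.g. use it as the src of an <img>
    }
});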
For high res image links from:
Link posts
Video posts
Photo posts
I use the following:
Note: The reason I give the _s -> _o hack precedence over the object_id/picture approach is that the object_id approach was not returning results for all images.
var picture = result.picture;
if (picture) {
    if (result.type === 'photo') {
        if (picture.indexOf('_s') !== -1) {
            console.log('CONVERTING');
            picture = picture.replace(/_s/, '_o');
        } else if (result.object_id) {
            picture = 'https://graph.facebook.com/' + result.object_id + '/picture?width=9999&height=9999';
        }
    } else {
        var qps = result.picture.split('&');
        for (var i = 0; i < qps.length; i++) {
            var qp = qps[i];
            var matches = qp.match(/(url=|src=)/gi);
            if (matches && matches.length > 0) picture = decodeURIComponent(qp.split(matches[0])[1]);
        }
    }
}
This is a new method to get a big image. It came about after the previous method stopped working.
/**
 * Returns a big image URL from Facebook.
 * Works only for posts of type PHOTO.
 * @param picture the picture URL taken from the feed
 * @param link whether the post is of type link
 * @return url of the big image, or null if it could not be resolved
 */
@Transactional
public String getBigImageByFacebookPicture(String picture, Boolean link) {
    if (link && picture.contains("url=http")) {
        String url = picture.substring(picture.indexOf("url=") + 4);
        try {
            url = java.net.URLDecoder.decode(url, "UTF-8");
        } catch (UnsupportedEncodingException e) {
            StringBuffer sb = new StringBuffer("Big image for Facebook link not found: ");
            sb.append(picture);
            loggerTakePost.error(sb.toString());
            return null;
        }
        return url;
    } else {
        try {
            Document doc = Jsoup.connect(picture).get();
            return doc.select("#fbPhotoImage").get(0).attr("src");
        } catch (Exception e) {
            StringBuffer sb = new StringBuffer("Big image for Facebook link not found: ");
            sb.append(picture);
            loggerTakePost.error(sb.toString());
            return null;
        }
    }
}
Enjoy your large image :)
Actually, you need two different solutions to fully fix this.
1] https://graph.facebook.com/{object_id}/picture
This solution works fine for images and videos posted to Facebook, but sadly it returns small images in case the original image file was not uploaded to Facebook directly (when posting a link to another site on your page, for example).
2] The Facebook Graph API provides a way to get the full images in the feed itself for those external links. If you add 'full_picture' to the fields, as in the example below, the API will return a link to the higher-resolution version.
https://graph.facebook.com/your_facebook_id/posts?fields=id,link,full_picture,description,name&access_token=123456
Combining these two solutions I ended up filtering the input in PHP as follows:
if ( isset( $post['object_id'] ) ) {
    $image_url = 'https://graph.facebook.com/'.$post['object_id'].'/picture';
} else if ( isset( $post['full_picture'] ) ) {
    $image_url = $post['full_picture'];
} else {
    $image_url = '';
}
See: http://api-portal.anypoint.mulesoft.com/facebook/api/facebook-graph-api/docs/reference/pictures
Just put "?type=large" after the URL to get the big picture.
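For example, with the object id placeholder used in the earlier answers: https://graph.facebook.com/{object-id-from-feed}/picture?type=large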
Thanks to @mattdlockyer for the JS solution. Here is a similar thing in PHP:
$posts = $facebook->api('/[page]/posts/', 'get');
foreach($posts['data'] as $post)
{
    if(stristr(@$post['picture'], '_s.'))
    {
        $post['picture'] = str_replace('_s.', '_n.', @$post['picture']);
    }
    if(stristr(@$post['picture'], 'url='))
    {
        parse_str($post['picture'], $picturearr);
        if($picturearr['url'])
            $post['picture'] = $picturearr['url'];
    }
    //do more stuff with $post and $post['picture'] ...
}
After a positive comment from @Lachezar Todorov, I decided to post my current approach (including paging and using Json.NET ;):
try
{
    FacebookClient fbClient = new FacebookClient(HttpContext.Current.Session[SessionFacebookAccessToken].ToString());
    JObject posts = JObject.Parse(fbClient.Get(String.Format("/{0}/posts?fields=message,picture,link,attachments", FacebookPageId)).ToString());
    JArray newsItems = (JArray)posts["data"];
    List<NewsItem> result = new List<NewsItem>();
    while (newsItems.Count > 0)
    {
        result.AddRange(GetItemsFromJsonData(newsItems));
        if (result.Count > MaxNewsItems)
        {
            result.RemoveRange(MaxNewsItems, result.Count - MaxNewsItems);
            break;
        }
        JToken paging = posts["paging"];
        if (paging != null && paging["next"] != null)
        {
            posts = JObject.Parse(fbClient.Get(paging.Value<String>("next")).ToString());
            newsItems = (JArray)posts["data"];
        }
        else
        {
            break; // no more pages to fetch
        }
    }
    return result;
}
catch (Exception)
{
    // log / handle the error as appropriate for your application
    throw;
}
And the helper method to retrieve individual items:
private static IEnumerable<NewsItem> GetItemsFromJsonData(IEnumerable<JToken> items)
{
    List<NewsItem> newsItems = new List<NewsItem>();
    foreach (JToken item in items.Where(item => item["message"] != null))
    {
        NewsItem ni = new NewsItem
        {
            Message = item.Value<String>("message"),
            DateTimeCreation = item.Value<DateTime?>("created_time"),
            Link = item.Value<String>("link"),
            Thumbnail = item.Value<String>("picture"),
            // http://stackoverflow.com/questions/28319242/simplify-looking-up-nested-json-values-with-json-net/28359155#28359155
            Image = (String)item.SelectToken("attachments.data[0].media.image.src") ?? (String)item.SelectToken("attachments.data[0].subattachments.data[0].media.image.src")
        };
        newsItems.Add(ni);
    }
    return newsItems;
}
NewsItem class I use:
public class NewsItem
{
    public String Message { get; set; }
    public DateTime? DateTimeCreation { get; set; }
    public String Link { get; set; }
    public String Thumbnail { get; set; }
    public String Image { get; set; }
}

Download file from web and put it in Stores of TestComplete

I am using TestComplete 7. I am writing a test that downloads a file from the web and puts the downloaded file in Stores. I am using C++Script for this, but I am having a problem: I don't know how to download a file from the web using its URL in C++Script. Can somebody give me any suggestions?
function Test(){
    // Specify the names of the source and destination files
    var strFileURL = "http://www.automatedqa.com/file to get";
    var strHDLocation = "c:\\temp\\filename";

    // Download the file
    var objHTTP = new ActiveXObject("MSXML2.XMLHTTP");
    objHTTP.open("GET", strFileURL, false);
    objHTTP.send();
    while((objHTTP.readyState != 4) && (objHTTP.readyState != 'complete')) {
        Delay(100);
    }
    if (200 != objHTTP.Status) {
        Log.Error("The " + strFileURL + " file was not found." + " The returned status is " + objHTTP.Status);
        return;
    }

    // Save the response body to disk
    var objADOStream = new ActiveXObject("ADODB.Stream");
    objADOStream.Open();
    objADOStream.Type = 1; // adTypeBinary
    objADOStream.Write(objHTTP.ResponseBody);
    objADOStream.Position = 0; // Set the stream position to the start
    var objFSO = new ActiveXObject("Scripting.FileSystemObject");
    if (objFSO.FileExists(strHDLocation)) objFSO.DeleteFile(strHDLocation);
    objADOStream.SaveToFile(strHDLocation);
    objADOStream.Close();

    // Add the downloaded file to the Files collection of the Stores
    Files.Add(strHDLocation);
}