I have a game in cocos2d v3.0. Everything was fine until I added the arm64 architecture. Now, on 64-bit devices, every CCButton callback receives a UITouch as the sender instead of a CCButton, as it does on 32-bit.
[EDIT]
My code:
-(void)addButton
{
CCButton *myButton = [CCButton buttonWithTitle:@"[ Pause ]" fontName:@"Jet Set.ttf" fontSize:18.0f];
myButton.positionType = CCPositionTypeNormalized;
myButton.position = ccp(0.5f, 0.95f);
myButton.name = @"testbutton";
[myButton setTarget:self selector:@selector(myButtonPressed:)];
[self addChild:myButton];
}
-(void)myButtonPressed:(CCButton*)sender
{
NSLog(@"%s = %@", "sender", sender);
}
// 32Bits => sender = <CCButton = 0x17e0c180 | Name = button_1>
// 64Bits => sender = <UITouch: 0x1743841d0>
Is there a way to solve it?
Thanks.
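One way to work around it until the underlying cocos2d issue is fixed (an untested sketch on my part, reusing the "testbutton" name from the code above) is to stop trusting the sender argument and fetch the button by name instead:
-(void)myButtonPressed:(id)sender
{
    // On arm64 'sender' may be a UITouch, so look the button up by the name set in addButton
    CCButton *button = (CCButton *)[self getChildByName:@"testbutton" recursively:NO];
    NSLog(@"button = %@", button);
    // ...handle the press using 'button' rather than 'sender'
}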
I'm trying to rewrite this piece of code in Vala:
gstreamer example
I got stuck at this line:
watch_id = gst_bus_add_watch (bus, message_handler, NULL);
My vala equivalent:
var watch_id = bus.add_watch (Priority.DEFAULT, message_handler);
I haven't got a clue how to write the BusFunc and its expected arguments:
BusFunc
Complete code so far:
using Gst;
bool Gst.BusFunc message_handler ()
{
return false;
}
void main (string[] args) {
// Initializing GStreamer
Gst.init (ref args);
var caps = Caps.from_string("audio/x-raw,channels=2");
// Creating pipeline and elements
var pipeline = new Pipeline ("my_pipeline");
var bin = new Bin ("my_bin");
var bus = new Bus ();
var src = ElementFactory.make ("autoaudiosrc", "my_src");
var sink = ElementFactory.make ("autoaudiosink", "my_sink");
var convert = ElementFactory.make ("audioconvert", "my_convert");
var level = ElementFactory.make ("level", "my_level");
var fakesink = ElementFactory.make ("fakesink", "my_fakesink");
// Adding elements to pipeline
//pipeline.add_many (src, sink);
bin.add_many (pipeline, src, convert, level, fakesink);
src.link(convert);
convert.link_filtered (level, caps);
level.link(fakesink);
level.set ("post-messages", true);
fakesink.set ("sync", true);
bus = pipeline.get_bus ();
var watch_id = bus.add_watch (Priority.DEFAULT, message_handler);
// Linking source to sink
src.link (sink);
// Set pipeline state to PLAYING
pipeline.set_state (State.PLAYING);
}
Thanks in advance!
You're almost there. A delegate specifies a function's signature: its parameter types and return type. The BusFunc type has the signature
public delegate bool BusFunc (Bus bus, Message message)
so your handler will be something like:
bool message_handler (Bus my_bus, Message my_message)
{
print (@"Message type: $(my_message.type.get_name ())\n");
return true;
}
It returns true in this example so the watch stays installed.
This example is not tested, but should give you the right idea to move forward.
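One more thing worth noting (also untested): a bus watch is only dispatched while a GLib main loop is running, so the end of your main will likely need something along these lines:
var loop = new MainLoop ();
bus = pipeline.get_bus ();
var watch_id = bus.add_watch (Priority.DEFAULT, message_handler);
pipeline.set_state (State.PLAYING);
loop.run (); // message_handler is dispatched from inside this loop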
I need some help. I'm programming a Windows 10 app in C++/CX. I am using two USB to RS485 devices, both of which have the same VID. In days of old, I could write a bit of software and connect to ports using good old COMx, etc.
I'm now following the example here: Serial Sample. It uses the approach of gathering device info, so when looking for connected devices, what I see in the list of available devices is the following:
\\?\FTDIBUS#VID_0403+PID_6001
Both devices have the same VID and PID, which leaves me unable to connect to the correct USB device; I think my app is trying to connect to both devices at the same time. Does anyone have any ideas about how I can resolve this hitch?
void MainPage::Get_Serial_Devices() {
cancellationTokenSource_Port1 = new Concurrency::cancellation_token_source();
cancellationTokenSource_Port2 = new Concurrency::cancellation_token_source();
// THIS USES ASYNCHRONOUS OPERATION. GET A LIST OF SERIAL DEVICES AND POPULATE THE COMBO BOX
Concurrency::create_task(ListAvailablePortsAsync()).then([this](DeviceInformationCollection^ serialDeviceCollection)
{
// serialDeviceCollection CONTAINS ALL SERIAL DEVICES FOUND, COPY INTO _deviceCollection
DeviceInformationCollection^ _deviceCollection = serialDeviceCollection;
// CLEAR EXISTING DEVICES FOR OUR OBJECT COLLECTION
_availableDevices->Clear();
// FOR EVERY DEVICE IN _deviceCollection
for (auto &&device : _deviceCollection) {
if (device->Name->Equals("USB-RS485 Cable")) {
// CREATE A NEW DEVICE TYPE AND APPEND TO OUR OBJECT COLLECTION
_availableDevices->Append(ref new Device(device->Id, device));
Total_Ports++;
this->DeviceLists->Items->Append(device->Id);
}
}
});
}
void MainPage::ConnectButton_Click(Object^ sender, RoutedEventArgs^ e) {
if (Port1_Connected == false) {
// CAST INDEX TO CORRELATING Device IN _availableDevices
Device^ selectedDevice = static_cast<Device^>(_availableDevices->GetAt(Port_1_ID));
// GET THE DEVICE INFO
DeviceInformation^ entry = selectedDevice->DeviceInfo;
Concurrency::create_task(ConnectToSerialDeviceAsync_Port1(entry, cancellationTokenSource_Port1->get_token())).then([this]( ) {
Get_Echo();
Waiting_For_Ack = true;
});
}
}
Concurrency::task<void> MainPage::ConnectToSerialDeviceAsync_Port1(DeviceInformation^ device, Concurrency::cancellation_token cancellationToken) {
// CREATE A LINKED TOKEN WHICH IS CANCELLED WHEN THE PROVIDED TOKEN IS CANCELLED
auto childTokenSource = Concurrency::cancellation_token_source::create_linked_source(cancellationToken);
// GET THE TOKEN
auto childToken = childTokenSource.get_token();
// CONNECT TO ARDUINO TASK
return Concurrency::create_task(SerialDevice::FromIdAsync(device->Id), childToken).then([this](SerialDevice^ serial_device) {
try {
_serialPort_Port1 = serial_device;
TimeSpan _timeOut; _timeOut.Duration = 10;
// CONFIGURE SERIAL PORT SETTINGS
_serialPort_Port1->WriteTimeout = _timeOut;
_serialPort_Port1->ReadTimeout = _timeOut;
_serialPort_Port1->BaudRate = 57600;
_serialPort_Port1->Parity = Windows::Devices::SerialCommunication::SerialParity::None;
_serialPort_Port1->StopBits = Windows::Devices::SerialCommunication::SerialStopBitCount::One;
_serialPort_Port1->DataBits = 8;
_serialPort_Port1->Handshake = Windows::Devices::SerialCommunication::SerialHandshake::None;
// CREATE OUR DATA READER OBJECT
_dataReaderObject_Port1 = ref new DataReader(_serialPort_Port1->InputStream);
_dataReaderObject_Port1->InputStreamOptions = InputStreamOptions::None;
// CREATE OUR DATA WRITE OBJECT
_dataWriterObject_Port1 = ref new DataWriter(_serialPort_Port1->OutputStream);
this->ConnectButton->IsEnabled = false;
this->DisconnectButton->IsEnabled = true;
// KICK OFF THE SERIAL PORT LISTENING PROCESS
Listen_Port1();
}
catch (Platform::Exception^ ex) {
this->Error_Window->Text = (ex->Message);
CloseDevice(PORT_1);
}
});
}
FT_PROG is a free EEPROM programming utility for use with FTDI devices. It modifies the EEPROM contents that store the FTDI device descriptors, which lets you customize a design, for example by programming a unique serial number into each cable so that otherwise identical devices can be told apart.
The full FT_PROG User Guide can be downloaded here.
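Assuming you program a unique serial number into each cable with FT_PROG, that serial then shows up in the device Id string, and the loop in Get_Serial_Devices can match on it. A rough sketch (the serial numbers below are made up for illustration):
// requires <string>; "FTAAA111" / "FTBBB222" are hypothetical serials programmed via FT_PROG
std::wstring id(device->Id->Data());
if (id.find(L"FTAAA111") != std::wstring::npos) {
    // first RS485 cable -> treat as Port 1
} else if (id.find(L"FTBBB222") != std::wstring::npos) {
    // second RS485 cable -> treat as Port 2
}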
I'm having trouble getting AVAudioEngine (OS X) to play nice with all sample rates.
Here's my code for building the connections:
- (void)makeAudioConnections {
auto hardwareFormat = [self.audioEngine.outputNode outputFormatForBus:0];
auto format = [[AVAudioFormat alloc] initStandardFormatWithSampleRate:hardwareFormat.sampleRate channels:2];
NSLog(@"format: %@", format);
@try {
[self.audioEngine connect:self.avNode to:self.audioEngine.mainMixerNode format:format];
[self.audioEngine connect:self.audioEngine.inputNode to:self.avNode format:format];
} @catch(NSException* e) {
NSLog(@"exception: %@", e);
}
}
On my audio interface, the render callback is called at 44.1, 48, and 176.4 kHz. It is not called at 96 or 192 kHz. On the built-in audio, the callback is called at 44.1, 48, and 88.2 kHz, but not at 96 kHz.
My AU's allocateRenderResourcesAndReturnError is being called for 96 kHz, and no errors are returned.
- (BOOL) allocateRenderResourcesAndReturnError:(NSError * _Nullable *)outError {
if(![super allocateRenderResourcesAndReturnError:outError]) {
return NO;
}
_inputBus.allocateRenderResources(self.maximumFramesToRender);
_sampleRate = _inputBus.bus.format.sampleRate;
return YES;
}
Here's my AU's init method, which is mostly just cut & paste from Apple's AUv3 demo:
- (instancetype)initWithComponentDescription:(AudioComponentDescription)componentDescription options:(AudioComponentInstantiationOptions)options error:(NSError **)outError {
self = [super initWithComponentDescription:componentDescription options:options error:outError];
if (self == nil) {
return nil;
}
// Initialize a default format for the busses.
AVAudioFormat *defaultFormat = [[AVAudioFormat alloc] initStandardFormatWithSampleRate:44100. channels:2];
// Create the input and output busses.
_inputBus.init(defaultFormat, 8);
_outputBus = [[AUAudioUnitBus alloc] initWithFormat:defaultFormat error:nil];
// Create the input and output bus arrays.
_inputBusArray = [[AUAudioUnitBusArray alloc] initWithAudioUnit:self busType:AUAudioUnitBusTypeInput busses: @[_inputBus.bus]];
_outputBusArray = [[AUAudioUnitBusArray alloc] initWithAudioUnit:self busType:AUAudioUnitBusTypeOutput busses: @[_outputBus]];
self.maximumFramesToRender = 256;
return self;
}
To keep things simple, I'm setting the sample rate before starting the app.
I'm not sure where to begin tracking this down.
Update
Here's a small project which reproduces the issue I'm having:
Xcode project to reproduce issue
You'll get errors pulling from the input at certain sample rates.
On my built-in audio running at 96 kHz, the render block is called with alternating 511 and 513 frame counts, with errors -10863 (kAudioUnitErr_CannotDoInCurrentContext) and -10874 (kAudioUnitErr_TooManyFramesToProcess) respectively. Increasing maximumFramesToRender doesn't seem to help.
Update 2
I simplified my test down to just connecting the input to the main mixer:
[self.audioEngine connect:self.audioEngine.inputNode to:self.audioEngine.mainMixerNode format:nil];
I also tried explicitly setting the format argument. This still will not play through at 96 kHz, so I'm thinking this may be a bug in AVAudioEngine.
For play-through with AVAudioEngine, the input and output hardware formats and all the connection formats must be at the same sample rate, so the following should work:
AVAudioFormat *outputHWFormat = [self.audioEngine.outputNode outputFormatForBus:0];
AVAudioFormat *inputHWFormat = [self.audioEngine.inputNode inputFormatForBus:0];
if (inputHWFormat.sampleRate == outputHWFormat.sampleRate) {
[self.audioEngine connect:self.audioEngine.inputNode to:self.audioEngine.mainMixerNode format:inputHWFormat];
[self.audioEngine connect:self.audioEngine.mainMixerNode to:self.audioEngine.outputNode format:inputHWFormat];
}
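If the two hardware rates don't match, one approach you could try (an untested sketch, not something I've verified against the docs) is to route the input through an extra AVAudioMixerNode, since mixer nodes can convert sample rates between their input and output:
AVAudioMixerNode *rateConverter = [[AVAudioMixerNode alloc] init];
[self.audioEngine attachNode:rateConverter];
// feed the mixer at the input hardware rate...
[self.audioEngine connect:self.audioEngine.inputNode to:rateConverter format:inputHWFormat];
// ...and let it resample on the way to the main mixer
[self.audioEngine connect:rateConverter to:self.audioEngine.mainMixerNode format:outputHWFormat];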
I want a random local notification once every 24 hours.
I know I can have a daily local notification using this:
UILocalNotification *localNotification = [[UILocalNotification alloc] init];
localNotification.fireDate = fireDate;
localNotification.timeZone = [NSTimeZone defaultTimeZone];
localNotification.repeatInterval = NSDayCalendarUnit;
localNotification.alertBody = alertText;
localNotification.alertAction = alertAction;
// Schedule it with the app
[[UIApplication sharedApplication] scheduleLocalNotification:localNotification];
[localNotification release];
But with this I get the same time every day. How can I have a random time each day?
Please help!
Is this even possible?
According to the UILocalNotification class reference, you can schedule up to 64 local notifications to fire at exact times. That is enough to cover a couple of months of randomly timed notifications, rescheduled on every app launch. Here is a sample:
- (void)scheduleLocalNotifications
{
[[UIApplication sharedApplication] cancelAllLocalNotifications];
static NSInteger dayInSeconds = 60*60*24;
NSInteger now = (NSInteger)[NSDate timeIntervalSinceReferenceDate];
NSInteger tomorrowStart = now - now % dayInSeconds + dayInSeconds;
for (int q=0; q<64; ++q)
{
NSInteger notificationTime = tomorrowStart + q*dayInSeconds + rand()%dayInSeconds;
NSDate * notificationDate = [NSDate dateWithTimeIntervalSinceReferenceDate:notificationTime];
NSLog(@"date %@", notificationDate);
UILocalNotification * notification = [UILocalNotification new];
notification.fireDate = notificationDate;
notification.timeZone = [NSTimeZone timeZoneForSecondsFromGMT:0];
notification.soundName = UILocalNotificationDefaultSoundName;
notification.alertBody = @"Hello!";
[[UIApplication sharedApplication] scheduleLocalNotification:notification];
}
}
For anybody looking to do this in Swift you can do something like this:
func scheduleNotifications() {
print("Scheduling reminder notifications")
UIApplication.sharedApplication().cancelAllLocalNotifications()
let windowInSeconds: UInt32 = 60*60*5
let oneDayInSeconds: Double = 60*60*24
let windowAroundHour = 14
let calendar = NSCalendar(calendarIdentifier: NSCalendarIdentifierGregorian)
calendar?.timeZone = NSTimeZone.localTimeZone()
if let todayAtWindowHour = calendar?.dateBySettingHour(windowAroundHour, minute: 0, second: 0, ofDate: NSDate(), options: .MatchFirst)?.timeIntervalSinceReferenceDate {
for index in 1...48 {
var notificationDate = ( todayAtWindowHour + ( Double(index) * oneDayInSeconds ) )
// either remove or add anything up to the window
if arc4random_uniform(2) == 0 {
notificationDate = notificationDate + Double(arc4random_uniform(windowInSeconds))
} else {
notificationDate = notificationDate - Double(arc4random_uniform(windowInSeconds))
}
let fireDate = NSDate(timeIntervalSinceReferenceDate: notificationDate)
let notification = UILocalNotification()
notification.alertBody = NSLocalizedString("You've received a new notification.", comment: "Notification")
notification.fireDate = fireDate
notification.timeZone = NSTimeZone.defaultTimeZone()
notification.applicationIconBadgeNumber = 1
UIApplication.sharedApplication().scheduleLocalNotification(notification)
print("Set for: \(fireDate)")
}
}
print("Finished scheduling reminder notifications")
}
The above will schedule random notifications for the next 48 days, making sure those notifications only fire at a reasonable time of the day.
You can call this from applicationDidBecomeActive():
func applicationDidBecomeActive(application: UIApplication) {
UIApplication.sharedApplication().applicationIconBadgeNumber = 0
scheduleNotifications()
}
Clean answer
This schedules a notification for each of the next 64 days. A good place to set it up is didFinishLaunchingWithOptions:, because it calls cancelAllLocalNotifications and then sets 64 notifications for the future, so every time the user opens the app it clears and reschedules the next 64 days from now.
[[UIApplication sharedApplication] cancelAllLocalNotifications];
NSDate *givenDate = [NSDate date]; // set your start date here
NSCalendar *calendar = [NSCalendar currentCalendar];
[calendar setTimeZone:[NSTimeZone localTimeZone]];
NSDateComponents *components = [calendar components:(NSCalendarUnitYear | NSCalendarUnitMonth | NSCalendarUnitDay | NSCalendarUnitHour | NSCalendarUnitMinute) fromDate:givenDate];
NSDateComponents *dateComps = [NSDateComponents new];
[dateComps setYear:components.year];
[dateComps setMonth:components.month];
[dateComps setDay:components.day];
[dateComps setHour:components.hour];
[dateComps setMinute:components.minute];
[dateComps setSecond:0];
NSDate *notificationDate = [calendar dateFromComponents:dateComps];
for (int x = 0; x < 64; x++) {
UILocalNotification *notification = [UILocalNotification new];
[notification setFireDate:notificationDate];
[notification setTimeZone:[NSTimeZone localTimeZone]];
[notification setSoundName:UILocalNotificationDefaultSoundName];
[notification setAlertBody:@"My notification body!"];
[[UIApplication sharedApplication] scheduleLocalNotification:notification];
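// to randomize the time of day (as the question asks), add a random offset here as well, e.g. arc4random_uniform(60*60*24), before computing the next date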
notificationDate = [NSDate dateWithTimeInterval:86400 sinceDate:notificationDate];
}
Output:
...
[26] Date : 2016-06-29 17:11:00 +0000
[27] Date : 2016-06-30 17:11:00 +0000
[28] Date : 2016-07-01 17:11:00 +0000
[29] Date : 2016-07-02 17:11:00 +0000
...
I am trying to implement screenshot functionality in a WinRT app that shows video via a MediaElement. I have the following code; it saves a screenshot that's the size of the MediaElement, but the image is empty (completely black). I've tried it with various types of media files. If I do Win Key + Vol Down on a Surface RT, the screenshot includes the media frame content, but if I use the following code, it's blackness all around :(
private async Task SaveCurrentFrame()
{
RenderTargetBitmap renderTargetBitmap = new RenderTargetBitmap();
await renderTargetBitmap.RenderAsync(Player);
var pixelBuffer = await renderTargetBitmap.GetPixelsAsync();
MultimediaItem currentItem = (MultimediaItem)this.DefaultViewModel["Group"];
StorageFolder currentFolder = Windows.Storage.ApplicationData.Current.LocalFolder;
var saveFile = await currentFolder.CreateFileAsync(currentItem.UniqueId + ".png", CreationCollisionOption.ReplaceExisting);
if (saveFile == null)
return;
// Encode the image to the selected file on disk
using (var fileStream = await saveFile.OpenAsync(FileAccessMode.ReadWrite))
{
var encoder = await BitmapEncoder.CreateAsync(BitmapEncoder.PngEncoderId, fileStream);
encoder.SetPixelData(
BitmapPixelFormat.Bgra8,
BitmapAlphaMode.Ignore,
(uint)renderTargetBitmap.PixelWidth,
(uint)renderTargetBitmap.PixelHeight,
DisplayInformation.GetForCurrentView().LogicalDpi,
DisplayInformation.GetForCurrentView().LogicalDpi,
pixelBuffer.ToArray());
await encoder.FlushAsync();
}
}
Here MultimediaItem is my view model class that, among other things, has a UniqueId property that's a string.
'Player' is the name of the MediaElement.
Is there anything wrong with the code, or is this approach wrong and I have to get into the trenches with C++?
P.S. I am interested in the WinRT API only.
Update 1
Looks like RenderTargetBitmap doesn't support this; the MSDN documentation clarifies it: http://msdn.microsoft.com/en-us/library/windows/apps/windows.ui.xaml.media.imaging.rendertargetbitmap
I'll appreciate any pointers on how to do it using DirectX C++. This is a major task for me, so I'll crack this one way or the other and report back with the solution.
Yes, it is possible. It's a little bit tricky, but it works well.
You don't use the MediaElement, but the StorageFile itself.
You need to create a WriteableBitmap with the help of the Windows.Media.Editing namespace.
Works in UWP (Windows 10).
This is a complete example with file picking, getting the video resolution, and saving the image to the Pictures Library:
TimeSpan timeOfFrame = new TimeSpan(0, 0, 1);//one sec
//pick mp4 file
var picker = new Windows.Storage.Pickers.FileOpenPicker();
picker.SuggestedStartLocation = Windows.Storage.Pickers.PickerLocationId.VideosLibrary;
picker.FileTypeFilter.Add(".mp4");
StorageFile pickedFile = await picker.PickSingleFileAsync();
if (pickedFile == null)
{
return;
}
///
//Get video resolution
List<string> encodingPropertiesToRetrieve = new List<string>();
encodingPropertiesToRetrieve.Add("System.Video.FrameHeight");
encodingPropertiesToRetrieve.Add("System.Video.FrameWidth");
IDictionary<string, object> encodingProperties = await pickedFile.Properties.RetrievePropertiesAsync(encodingPropertiesToRetrieve);
uint frameHeight = (uint)encodingProperties["System.Video.FrameHeight"];
uint frameWidth = (uint)encodingProperties["System.Video.FrameWidth"];
///
//Use Windows.Media.Editing to get ImageStream
var clip = await MediaClip.CreateFromFileAsync(pickedFile);
var composition = new MediaComposition();
composition.Clips.Add(clip);
var imageStream = await composition.GetThumbnailAsync(timeOfFrame, (int)frameWidth, (int)frameHeight, VideoFramePrecision.NearestFrame);
///
//generate bitmap
var writableBitmap = new WriteableBitmap((int)frameWidth, (int)frameHeight);
writableBitmap.SetSource(imageStream);
//generate some random name for file in PicturesLibrary
var saveAsTarget = await KnownFolders.PicturesLibrary.CreateFileAsync("IMG" + Guid.NewGuid().ToString().Substring(0, 4) + ".jpg");
//get stream from bitmap
Stream stream = writableBitmap.PixelBuffer.AsStream();
byte[] pixels = new byte[(uint)stream.Length];
await stream.ReadAsync(pixels, 0, pixels.Length);
using (var writeStream = await saveAsTarget.OpenAsync(FileAccessMode.ReadWrite))
{
var encoder = await BitmapEncoder.CreateAsync(BitmapEncoder.JpegEncoderId, writeStream);
encoder.SetPixelData(
BitmapPixelFormat.Bgra8,
BitmapAlphaMode.Premultiplied,
(uint)writableBitmap.PixelWidth,
(uint)writableBitmap.PixelHeight,
96,
96,
pixels);
await encoder.FlushAsync();
using (var outputStream = writeStream.GetOutputStreamAt(0))
{
await outputStream.FlushAsync();
}
}
Yeah... I spent a lot of hours on this.
OK, I have managed to get taking a snapshot from a MediaElement on button press working.
I am passing a MediaStreamSource object to the MediaElement using the SetMediaStreamSource method. MediaStreamSource has a SampleRequested event, which is fired basically every time a new frame is drawn. Then, using a boolean, I control when to create the bitmap:
private async void MediaStream_SampleRequested(MediaStreamSource sender, MediaStreamSourceSampleRequestedEventArgs args)
{
if (!takeSnapshot)
{
return;
}
takeSnapshot = false;
Task.Run(() => DecodeAndSaveVideoFrame(args.Request.Sample));
}
After that, what is left is to decode the compressed image and convert it to a WriteableBitmap. The image is (or at least was in my case) in YUV format. You can get the byte array using
byte[] yvuArray = sample.Buffer.ToArray();
and then get the data from this array and convert it to RGB. Unfortunately I cannot post the entire code, but I'm going to give you a few more hints (plus a rough conversion sketch after them):
YUV to RGB wiki: here you have a wiki page describing how YUV to RGB conversion works.
Here I found a Python project whose solution I adapted (and it works perfectly). To be more precise, you have to analyze how the NV12Converter method works.
The last thing is to change the takeSnapshot boolean to true after pressing the button or on some other activity :).
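To make those hints a bit more concrete, here is a rough, untested sketch of the per-pixel math for one NV12 frame (BT.601, video range); the helper name and exact coefficients are my assumptions, not code from the adapted project:
using System;

static byte[] Nv12ToBgra(byte[] nv12, int width, int height)
{
    var bgra = new byte[width * height * 4];
    int uvStart = width * height; // the interleaved UV plane follows the Y plane
    for (int y = 0; y < height; y++)
    {
        for (int x = 0; x < width; x++)
        {
            int yIndex = y * width + x;
            int uvIndex = uvStart + (y / 2) * width + (x & ~1); // one UV pair per 2x2 block
            int c = nv12[yIndex] - 16;
            int d = nv12[uvIndex] - 128;     // U
            int e = nv12[uvIndex + 1] - 128; // V
            int r = (298 * c + 409 * e + 128) >> 8;
            int g = (298 * c - 100 * d - 208 * e + 128) >> 8;
            int b = (298 * c + 516 * d + 128) >> 8;
            int o = yIndex * 4;
            bgra[o] = (byte)Math.Max(0, Math.Min(255, b));
            bgra[o + 1] = (byte)Math.Max(0, Math.Min(255, g));
            bgra[o + 2] = (byte)Math.Max(0, Math.Min(255, r));
            bgra[o + 3] = 255; // opaque alpha
        }
    }
    return bgra;
}
The resulting BGRA bytes can then be copied into something like writableBitmap.PixelBuffer, as in the earlier answer.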