Setting a cover with MPMediaItemArtwork not working on iOS 16 - SwiftUI

I am building an app with AVKit and MediaPlayer and want to show a cover in the Control Center. Using this code didn't work for me:
if let image = UIImage(named: "myCover") {
    nowPlayingInfo[MPMediaItemPropertyArtwork] = MPMediaItemArtwork(boundsSize: image.size) { size in
        return image
    }
}
At first I thought that downloading the UIImage had failed, but then I realised that the issue also happened when using a locally saved image.
By the way: Setting up the player (pause/play/background mode) worked fine.
I am using iOS 16 and Xcode 14.0.1.

Where is your nowPlayingInfo coming from?
In my process I declare a new dictionary with the title and the other metadata,
then I set the artwork later, after the async image download, using
MPNowPlayingInfoCenter.default().nowPlayingInfo?[MPMediaItemPropertyArtwork] = artwork
I mean, if you set it directly on the default center, is it still not showing?

Related

Dynamic theme color at run time in Jetpack Compose

I'm new to Jetpack Compose, so I'm struggling to implement dynamic colors (and font, size, etc., but I think they work the same way, so I'll just focus on color) at run time from a backend. I'll give the app some default colors, plus a default splash screen whose only job is to load the color settings from the backend. In case the API request fails, the app would use the colors from the last successful request, or just the default colors.
The tutorials I found around the internet were only about switching the dark/light theme at run time, not about changing a specific color in the color pack. In those tutorials the colors are defined in a Colors.kt file, which is not a composable or a class or an object.
I imagine the color within lightColors or darkColors would be something like this.
return lightColors(
    primary = Color(android.graphics.Color.parseColor("#" + dynamicColorMap["One"])),
    ...
)
And when dynamicColorMap changes in the splash screen, every screen composed afterwards would reference the new values, but I don't know how to update that variable from outside a composable.
I thought of using a DB to store the colors, but getting data from the DB is async, so it cannot be queried in the default Colors.kt like var colorOne = DBManager.getColor("One"). I could run the async task in my splash screen before moving to the next screen, but then the problem again is how to have a global state that my theme composable wrapper can access on every screen.
I just don't know where to start for this case.
Thank you for your time
EDIT:
I currently have the project structured with MVVM. For now only one activity (MainActivity) is present, and inside that activity the splash screen, home screen, login screen, etc. are navigated between. So is it good practice to create a viewmodel for the MainActivity that can hold the color state for the theme?
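For reference, a minimal sketch of what such a MainActivity-scoped viewmodel could look like. The AppSettings/AppSettingsState shape and the AppSettingsRepository loader are assumptions made for illustration; only the MainActivityViewModel name and the colorsMapLight/colorsMapDark fields come from the code further below.

import androidx.compose.runtime.State
import androidx.compose.runtime.mutableStateOf
import androidx.lifecycle.ViewModel
import androidx.lifecycle.viewModelScope
import kotlinx.coroutines.launch

// Assumed data shape, mirroring the fields referenced in the code below.
data class AppSettings(
    val colorsMapLight: Map<String, String> = emptyMap(),
    val colorsMapDark: Map<String, String> = emptyMap()
)

data class AppSettingsState(val appSettings: AppSettings = AppSettings())

// Placeholder for whatever loads the settings from the backend or a local DB.
object AppSettingsRepository {
    suspend fun fetch(): AppSettings = AppSettings(
        colorsMapLight = mapOf("One" to "6200EE"),
        colorsMapDark = mapOf("One" to "BB86FC")
    )
}

class MainActivityViewModel : ViewModel() {

    private val _appSettingsState = mutableStateOf(AppSettingsState())
    val appSettingsState: State<AppSettingsState> = _appSettingsState

    // Called once, e.g. from the theme wrapper or the splash screen.
    fun getAppSettings() {
        viewModelScope.launch {
            runCatching { AppSettingsRepository.fetch() }
                .onSuccess { _appSettingsState.value = AppSettingsState(appSettings = it) }
                .onFailure { /* keep the defaults, or the last successfully fetched settings */ }
        }
    }
}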
Thanks @Maciej Ciemiega for the suggestion. I ended up structuring my code like this.
In my MainActivity.kt I create a viewmodel for it.
val mainActivityViewModel by viewModels<MainActivityViewModel>()
MyTheme(mainActivityViewModel = mainActivityViewModel) {
    initNavigationController(navController)
    Surface(color = MaterialTheme.colors.background) {
        if (mainActivityViewModel.appSettingsState.value.appSettings.colorsMapLight.size != 0
            && mainActivityViewModel.appSettingsState.value.appSettings.colorsMapDark.size != 0) {
            navController.navigate(NavigationDestinations.homeScreen)
        }
    }
}
My initNavigationController function shows the splash screen first, but the splash screen itself doesn't do anything. Fetching the app settings configuration is triggered in the MyTheme composable via the mainActivityViewModel, MyTheme uses the state from the viewmodel to define the theme, and navController.navigate is driven by that state, as you can see in the if above.
I don't know whether this is good practice, or whether it will become a mess as my app grows, but at least it works for me. I tried it with font styles too and it works like a charm.
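A rough sketch of the MyTheme wrapper described above, under the assumptions of the viewmodel sketch earlier; the getAppSettings() name and the state shape are guesses based on the snippets in this thread, and only primary is mapped here (the other Material colors would follow the same pattern).

import androidx.compose.foundation.isSystemInDarkTheme
import androidx.compose.material.MaterialTheme
import androidx.compose.material.darkColors
import androidx.compose.material.lightColors
import androidx.compose.runtime.Composable
import androidx.compose.runtime.LaunchedEffect
import androidx.compose.ui.graphics.Color

@Composable
fun MyTheme(
    mainActivityViewModel: MainActivityViewModel,
    content: @Composable () -> Unit
) {
    // Trigger the settings request once when the theme first enters composition
    // (getAppSettings() is the assumed loader from the sketch above).
    LaunchedEffect(Unit) { mainActivityViewModel.getAppSettings() }

    val settings = mainActivityViewModel.appSettingsState.value.appSettings
    val dark = isSystemInDarkTheme()
    val colorsMap = if (dark) settings.colorsMapDark else settings.colorsMapLight

    // Fall back to hard-coded defaults while the maps are still empty.
    val primaryHex = colorsMap["One"] ?: if (dark) "BB86FC" else "6200EE"
    val primary = Color(android.graphics.Color.parseColor("#$primaryHex"))

    val colors = if (dark) darkColors(primary = primary) else lightColors(primary = primary)

    MaterialTheme(colors = colors, content = content)
}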

Cocos2d-x tilemap black screen

I am a beginner with cocos2d-x and am trying to use it with Tiled to create maps.
I created a TileMap, and here is my code in LevelOne::init() in level_one.cpp:
if (!CCLayer::init())
{
    return false;
}
_tileMap = new CCTMXTiledMap();
_tileMap->initWithTMXFile("levelone.tmx");
this->addChild(_tileMap);
return true;
The debugger shows me that the _tileMap variable does contain (at least part of) the information from my levelone.tmx file.
But when I run it, I get a black screen.
Here is the project on github : https://github.com/LeopoldBriand-bot/Platformer
What could I have misunderstood?
Thanks.
I figured it out: the tiles folder should be on the same disk as the .tmx file so that the path stays relative, and both should be in the Resources folder.

Dragging an image from Chrome to a Qt application

I am trying to get the data from an image dragged directly from a browser to my Qt app. I have the following code:
void MyView::dropEvent( QDropEvent* event )
{
    QGraphicsView::dropEvent( event );
    if ( event->mimeData()->hasImage() )
    {
        QImage image = qvariant_cast<QImage>( event->mimeData()->imageData() );
        ...
This works fine with Firefox (Windows/Mac), Safari (Mac) and IE (Windows), but for Chrome on both Windows and Mac, QMimeData::hasImage() returns false.
On further investigation, Qt expects image data in the MIME format "application/x-qt-image", and it seems that Chrome doesn't provide this. Is there a workaround for Chrome? I haven't been able to find anything.

Google Glass preview image scrambled with new XE10 release

This occurs with a few APKs that make use of the camera (e.g., ZXing, OpenCV). The preview shows a glitched image, but the image is still a function of what the camera sees, so it appears to be an encoding mismatch. The native camera preview works fine, so the internal apps do not exhibit this problem.
For now, please try adding the following workaround after you acquire the Camera but before you set up and start the preview:
Camera.Parameters params = camera.getParameters();
params.setPreviewFpsRange(30000, 30000);
camera.setParameters(params);
(Or just add the setPreviewFpsRange call to your existing parameters if you're setting others as well.)
For anyone using ZXing on their Glass, you can build a version from the source code with the above fix.
Add the following method into CameraConfigurationManager.java
public void googleGlassXE10WorkAround(Camera mCamera) {
    Camera.Parameters params = mCamera.getParameters();
    params.setPreviewFpsRange(30000, 30000);
    params.setPreviewSize(640, 360);
    mCamera.setParameters(params);
}
And call this method immediately after anywhere you see Camera.setParameters() in the ZXing code. I just put it in two places in the CameraConfigurationManager and it worked.
I set the Preview Size to be 640x360 to match the Glass resolution.
30 FPS preview is pretty high. If you want to save some battery and CPU, consider the slowest supported FPS to be sufficient:
List<int[]> supportedPreviewFpsRanges = parameters.getSupportedPreviewFpsRange();
int[] minimumPreviewFpsRange = supportedPreviewFpsRanges.get(0);
parameters.setPreviewFpsRange(minimumPreviewFpsRange[0], minimumPreviewFpsRange[1]);
The bug still exists as of XE16 and XE16.11, but this code gets past the glitch and shows a normal camera preview; note the three parameter-setting lines and their values. I have also tested this at 5000 (5 FPS), which works, and at 60000 (60 FPS), which does not work:
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
    if (mCamera == null) return;
    Camera.Parameters camParameters = mCamera.getParameters();
    camParameters.setPreviewFpsRange(30000, 30000);
    camParameters.setPreviewSize(1920, 1080);
    camParameters.setPictureSize(2592, 1944);
    mCamera.setParameters(camParameters);
    try {
        mCamera.startPreview();
    } catch (Exception e) {
        mCamera.release();
        mCamera = null;
    }
}
This is still an issue as of XE22(!). Lowering the frame rate to 30 FPS or lower does the trick:
parameters.setPreviewFpsRange(30000, 30000);
And indeed, don't forget to set the parameters:
camera.setParameters(parameters);
I have found no clear explanation of why this causes trouble, since 60 fps is included in the supported fps range. The camera can record 720p video, but I never saw a source state the frame rate used for that.
You can set params.setPreviewSize(1200, 800) and adjust the values around this range until the color noise clears.

Setting icons in Unity3d standalone player

I am using Unity with a build script that creates my application by ultimately invoking BuildPipeline from the script. However, I am trying to figure out how to set up the icons. After calling PlayerSettings.SetIconsForTargetGroup and invoking BuildPipeline.BuildPlayer, the appropriate icon does not show up for the executable file produced, nor is it displayed while the program is running.
I am currently using the following code.
Texture2D texture = AssetDatabase.LoadMainAssetAtPath(iconFile) as Texture2D;
int[] sizeList = PlayerSettings.GetIconSizesForTargetGroup(BuildTargetGroup.Standalone);
Texture2D[] iconList = new Texture2D[sizeList.Length];
for (int i = 0; i < sizeList.Length; i++)
{
    int iconSize = sizeList[i];
    iconList[i] = (Texture2D)Instantiate(texture);
    iconList[i].Resize(iconSize, iconSize, TextureFormat.ARGB32, false);
}
PlayerSettings.SetIconsForTargetGroup(BuildTargetGroup.Standalone, iconList);
What am I doing wrong?
Any assistance would be greatly appreciated. Thank you
Using BuildTargetGroup.Unknown works for me.
This also worked for me:
BuildTargetGroup.Unknown
But with an unwanted outcome: Unity does not override all icon sizes when setting one single Texture2D PNG, so Windows Explorer swaps the icon of my application EXE between the Unity default icon and my icon at different resolutions. It definitely seems like a bug in Unity3D's BuildPipeline.BuildPlayer(..) or SetIconsForTargetGroup(..).
If you're calling BuildPipeline.BuildPlayer for OS X, then you need to specify the full bundle folder:
/MyPath/MyApp.app
i.e., don't leave out the ".app" part; without it, Unity builds the app but doesn't include the icons or the splash image in the resolution dialog.