Label displaying "Optional(...)" in Swift when using NumberFormatter - swift3

I have the following code:
let numberFormatter = NumberFormatter()
numberFormatter.numberStyle = NumberFormatter.Style.decimal
numberFormatter.maximumFractionDigits = 2
let formattedNumber = numberFormatter.string(from: NSNumber(value: rawValue))
currentLogBF.text = "\(formattedNumber) BF"
In the above example, rawValue is a Double that is calculated when all of the input fields have values in them.
currentLogBF is a label in my View.
Whenever a calculation is completed, the label displays something like this:
Optional("12,307.01") BF
How do I get rid of the "Optional()" piece, so it just displays this:
12,307.01 BF
Any ideas what I am doing wrong here?

The function numberFormatter.string(from: NSNumber) returns an optional String (String?) instead of a String.
You will need to unwrap it first, like this:
if let formattedNumber = numberFormatter.string(from: NSNumber(value: rawValue)) {
    currentLogBF.text = "\(formattedNumber) BF"
} else {
    Log.warn("Failed to format number!")
}
And as a bonus, use String(format: "%@ BF", formattedNumber) rather than "\(formattedNumber) BF" when dealing with optionals.
String(format:) will give you a compile error when you try to pass an optional value as an argument, whereas string interpolation silently prints the Optional(...) wrapper.
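For example, a minimal sketch combining both suggestions (rawValue and the printed output here are illustrative, assuming an en_US locale):
import Foundation
let numberFormatter = NumberFormatter()
numberFormatter.numberStyle = .decimal
numberFormatter.maximumFractionDigits = 2
let rawValue = 12307.0129 // stand-in for the Double calculated from the input fields
if let formattedNumber = numberFormatter.string(from: NSNumber(value: rawValue)) {
    // formattedNumber is a plain String here, so String(format:) accepts it;
    // passing the optional directly would not compile.
    let labelText = String(format: "%@ BF", formattedNumber)
    print(labelText) // 12,307.01 BF
}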

The problem is unwrapping an optional value:
let formattedNumber: String? = numberFormatter.string(from: NSNumber(value: rawValue))
currentLogBF.text = "\(formattedNumber!) BF" // force-unwrap: crashes with "unexpectedly found nil while unwrapping an Optional value" if the formatter returned nil
currentLogBF.text = "\(formattedNumber) BF"  // interpolating the optional handles nil, but shows Optional("12,307.01") in the label

Related

How do I round the Decimal type?

Using Swift3 and still getting the hang of things. I'm using the Decimal type because it involves currency and I'm having a difficult time with getting the rounding to work. I've read through the NSDecimalNumberHandler documentation and the rounding function but don't quite understand how to get this to work. Essentially I just want all my Decimal types in this class to round to the hundredths spot when the calculation functions I've built run.
Can someone give me quick example of how to do this? Thanks!
Please check this:
This is using NSDecimalNumber & NSDecimalNumberHandler:
let decimalStr = NSDecimalNumber(string: "500.2595")
let decimalStrHandler = NSDecimalNumberHandler(roundingMode: .plain, scale: 2, raiseOnExactness: false, raiseOnOverflow: false, raiseOnUnderflow: false, raiseOnDivideByZero: false) // scale: 2 rounds to the hundredths place
let roundedVal = decimalStr.rounding(accordingToBehavior: decimalStrHandler)
print(roundedVal) // prints 500.26
This is using NumberFormatter & Decimal:
extension Decimal {
    func roundDecimal() -> String {
        let formatter = NumberFormatter()
        formatter.minimumFractionDigits = 2
        formatter.maximumFractionDigits = 2 // cap at two places so the value is actually rounded to hundredths
        return formatter.string(from: self as NSDecimalNumber)!
    }
}
You call it like below:
let decimalStr = Decimal(string: "500.2595")!
print(decimalStr.roundDecimal()) // prints 500.26
let decimalFloat = Decimal(floatLiteral: 500.2595)
print(decimalFloat.roundDecimal()) // prints 500.26
You should never save a currency value as a decimal number. Always use an integer, like this:
1.00$ = 100
4567.89$ = 456789
And then when you want to present it not in cents you can divide by 100.
See this: Why not use Double or Float to represent currency?
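A quick sketch of that integer-cents idea (the names here are illustrative):
import Foundation
let priceInCents = 456789                       // 4567.89$ stored as an integer number of cents
let displayValue = Double(priceInCents) / 100.0 // divide by 100 only for display
print(String(format: "%.2f", displayValue))     // 4567.89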
There is a specific function called NSDecimalRound which you can use for this. Here is an extension to Decimal which you can use to get standard round and rounded functions:
extension Decimal {
    mutating func round(_ scale: Int, _ roundingMode: NSDecimalNumber.RoundingMode) {
        var localCopy = self
        NSDecimalRound(&self, &localCopy, scale, roundingMode)
    }
    func rounded(_ scale: Int, _ roundingMode: NSDecimalNumber.RoundingMode) -> Decimal {
        var result = Decimal()
        var localCopy = self
        NSDecimalRound(&result, &localCopy, scale, roundingMode)
        return result
    }
}
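For completeness, a usage sketch of the extension above:
var price = Decimal(string: "500.2595")!
print(price.rounded(2, .plain)) // 500.26 (non-mutating)
price.round(2, .plain)          // rounds price in place
print(price)                    // 500.26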

UserDefaults values are missing

I want to save some values in UserDefaults, and I am using this code to save them:
func SaveSettings(){
    let def = UserDefaults.standard
    def.set("test", forKey: "Value1")
    def.set(myString, forKey: "Value2") // value: test1
    def.set(myInt, forKey: "Value3")    // value: 25
    def.set(myInt64, forKey: "Value4")  // value: 103254
    def.synchronize() // I've tried to remove this line
}
After saving, I use this code to check whether my values were saved in UserDefaults or not:
for (key, value) in UserDefaults.standard.dictionaryRepresentation() {
    print("\(key) = \(value) \n")
}
And I see this result:
{
Value1 = test
Value2 = test1
Value3 = 25
Value4 = 103254
}
There isn't any problem until now. But after I restart the app and look at the values in UserDefaults, I see this result:
{
Value1 = test
Value2 =
}
As you can see, Value3 and Value4 are missing. However, Value1 and Value2 remain, but Value2's value is empty.
I've found the problem.
SaveSettings() was called while myInt and myInt64 were nil and myString's value was empty. Somehow Value3 and Value4 were deleted there. I guess this happened because their values were nil, but I am not sure that is the exact reason.
When you run your app, the string value you set for Value1 gets saved, but the other variables don't have any values yet when the app first runs, and passing nil to set(_:forKey:) removes that key. So you need to check that myString, myInt, and myInt64 are not nil before saving into UserDefaults.standard.
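One way to apply that advice (a sketch, assuming myString, myInt, and myInt64 are the same optional properties as in the question):
func saveSettings() {
    let def = UserDefaults.standard
    def.set("test", forKey: "Value1")
    // Passing nil to set(_:forKey:) removes the key, so only write values that exist.
    if let myString = myString, !myString.isEmpty {
        def.set(myString, forKey: "Value2")
    }
    if let myInt = myInt {
        def.set(myInt, forKey: "Value3")
    }
    if let myInt64 = myInt64 {
        def.set(myInt64, forKey: "Value4")
    }
}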

How to change only variable color in a print label

I want to change the color of only the variables in a label text, like this:
result.text = ("\(var1) candy " + " of: \(var2) blablabla")
Now, how can I change the color of var1 and var2? From the inspector I can set a red color for the whole label, but it's impossible to change the color of only the variables.
Thanks a lot
What you have to do is create an attributed string for each color (let's say blue and yellow):
let t : NSAttributedString = NSAttributedString(string: "Hi", attributes: [NSForegroundColorAttributeName : UIColor.blue])
let t2 : NSAttributedString = NSAttributedString(string: " There", attributes: [NSForegroundColorAttributeName : UIColor.yellow])
If you want to join them together:
let final = NSMutableAttributedString(attributedString: t)
final.append(t2)
UPDATE
So in your case, since you want to color two different sections of your resulting string, you'd want to use the NSRange approach, with a method like this:
func colorTheVariables(_ var1Value: String, _ var2Value: String) {
    let middle = " candy of: ".characters.count
    let value = "\(var1Value) candy of: \(var2Value)"
    let text = NSMutableAttributedString(string: value)
    let range1 = NSRange(location: 0, length: var1Value.characters.count)
    let range2 = NSRange(location: range1.length + middle, length: var2Value.characters.count)
    // var1Color and var2Color are UIColor properties you define elsewhere
    text.addAttribute(NSForegroundColorAttributeName, value: var1Color, range: range1)
    text.addAttribute(NSForegroundColorAttributeName, value: var2Color, range: range2)
    result.attributedText = text
}
This should give you the coloring you want for var1 and var2. It's important that you assign the text to the label's attributedText, not to text. Printing result.text will give you the plain content of the label, but printing result.attributedText will give you the verbose NSAttributedString description.
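A hypothetical call site, assuming var1Color and var2Color are UIColor properties (say .red and .blue) and result is the UILabel outlet:
colorTheVariables("12", "chocolate")
// result now shows "12 candy of: chocolate",
// with "12" drawn in var1Color and "chocolate" in var2Color.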

Handling null value in a dictionary object

I am using a web service to get information. When the info is returned, I convert the data received from JSON to a dict.
When dumping the dict object, some of the items arrive like this:
▿ (2 elements)
- key: "street1"
- value: <null> #4
How would I go about reading this data and knowing that the value is NULL?
I have tried the following:
let street1:String = dict?["street1"] as! String
This fails with: Could not cast value of type 'NSNull' (0x10fbf7918) to 'NSString' (0x10f202c60).
The data could have a String value. So I tried:
let street1:Any = dict?["street1"] as Any
When I print street1, I get the following:
street1: Optional()
So my question is:
How would I go about reading this data and knowing that the value is null?
You can use if let for this type of nil check.
Try this instead:
if let street1 = dict?["street1"] as? String {
    // If this succeeds then you can use street1 in here
    print(street1)
}
Update:
var t_street1 = ""
if let street1 = dict?["street1"] as? String {
t_street1 = street1
}
There is no need for an else; t_street1 stays empty since you initialized it with an empty string.
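If you also need to tell "null in the JSON" apart from "key missing entirely", here is a sketch using Foundation's NSNull (the sample dict is made up):
import Foundation
let dict: [String: Any]? = ["street1": NSNull(), "street2": "Main St"]
if let value = dict?["street1"] {
    if value is NSNull {
        print("street1 is null in the JSON") // this branch runs for the sample dict
    } else if let street1 = value as? String {
        print("street1 = \(street1)")
    }
} else {
    print("street1 key is missing entirely")
}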

Extract String value from Node

I'm trying to query a database using something like this:
let db = drop.database?.driver as? MySQLDriver
let query = "select \(fields) from \(table) where \(condition)"
let result = try db.raw(query)
I get the following Node object as result:
array([Node.Node.object(["field_name": Node.Node.string("value_info")])])
How can I get the value_info into a String variable?
You can use PathIndexable to step into the result object, then Polymorphic to cast it as a string.
Should look something like this:
let valueInfo = result[0, "field_name"]?.string