To demonstrate the problem, I have recreated the error in a Swift playground with a minimal example:
```swift
func aFuncTakingInt32(_ val: Int32) {
    // do stuff
}

let validNumber = 42
aFuncTakingInt32(validNumber)
```
This code generates the same error:
```
Cannot invoke 'aFuncTakingInt32' with an argument list of type '(Int)'
```
If we change validNumber's type to explicitly be Int32:
```swift
let validNumber: Int32 = 42
```
Or if we construct an Int32 out of the Int when we call the function:
```swift
aFuncTakingInt32(Int32(validNumber))
```
The error goes away and the code runs perfectly fine.
As an additional detail that helps in solving this problem: since the method we're trying to call is defined not in Swift but in an Objective-C++ file we're importing, there IS a way to find out, on the Swift side, exactly what argument types the function expects. Type out the method name, then hold the Option (Alt) key and click the method name, and you'll get some useful information:

Now we can see the explicit type that Swift expects us to pass to that method. Here it is clearly marked as an Int32 (which, for our minimal example, we already knew, since we just wrote that method in Swift). This trick is very useful when we don't have access to the source code, or when we're not sure how Swift is translating declarations from another language.
Swift's Int type is different from the int primitive from Objective-C, C, C++, and many other languages. Swift's Int is more similar to Objective-C's NSInteger.
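To make that size difference concrete, here's a minimal sketch (the 8-byte result assumes a 64-bit platform):

```swift
// Int's width follows the platform's native word size;
// Int32 is always exactly 32 bits (4 bytes).
print(MemoryLayout<Int>.size)    // 8 on 64-bit platforms
print(MemoryLayout<Int32>.size)  // always 4
```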
If we tried passing Objective-C's NSInteger to this C++ method, it would compile, though it would probably generate a warning (versus Swift's hard error). That's because Objective-C is far less strict and is willing to do some implicit conversions on primitives.
With that said, Swift has a handful of types that are more equivalent to C, C++, and Objective-C's int (and the other sizes of integers). Here, we most likely want Int32.
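Swift's standard library spells those C equivalents out as type aliases (CInt, CShort, CLong, and so on); a quick sketch to confirm which Swift type C's int maps to:

```swift
// CInt is the standard-library type alias for C's int.
print(CInt.self)    // prints Int32
// CLong's width is platform-dependent (Int on 64-bit Unix platforms).
print(CLong.self)
```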
The reason we can use `let result = refScen.getCalcLoad(42)` is that `42` is a literal. By default an integer literal is interpreted as an Int (as we see with `let validNumber = 42`), but in context it can also be interpreted as many other numeric types, including Int32 (and even floating-point types).
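A minimal sketch of that literal flexibility (the names here are just illustrative):

```swift
let a: Int32 = 42   // the literal adapts: a is an Int32
let b: Double = 42  // the literal adapts: b is a Double
let c = 42          // no context, so the default type Int is chosen
print(type(of: c))  // prints Int
```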
However, when we do the following:
```swift
let validNumber = 42
```
Here, Swift must infer a concrete type for validNumber, and the type it infers for an integer literal is Int (which is different from Objective-C, C, and C++'s int).
So when we try the following:
```swift
let validNumber = 42
let result = refScen.getCalcLoad(validNumber)
```
We're trying to pass an Int to a method that expects an Int32, and because Swift is very strict about the types of the parameters it lets us pass, it generates an error (where Objective-C would potentially only generate a warning).
We should be able to fix this by explicitly declaring validNumber's type as the appropriate one, Int32:
```swift
let validNumber: Int32 = 42
let result = refScen.getCalcLoad(validNumber)
```
Alternatively, if it doesn't make sense on the Swift side to keep this variable's type as Int32, we can construct an Int32 from the Int at the call site:
```swift
let validNumber = 42
let result = refScen.getCalcLoad(Int32(validNumber))
```
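One caveat with that conversion: Int32(validNumber) traps at runtime if the value doesn't fit in 32 bits. If the Int can be large, the failable Int32(exactly:) initializer is a safer sketch:

```swift
let big = 5_000_000_000            // larger than Int32.max
if let narrowed = Int32(exactly: big) {
    print("fits: \(narrowed)")
} else {
    print("out of Int32 range")    // this branch runs for 5_000_000_000
}
```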
As an additional note, we don't run into this problem with a String because Objective-C's NSString bridges to Swift's String. Booleans are also generally safe: C and C++'s bool imports as Swift's Bool (and Objective-C's BOOL bridges to Bool on most 64-bit platforms). Be careful with char, though: C's char imports as CChar, a type alias for Int8, not as Swift's Character.