
Given:

```scala
class Foo[T] { def get: T }

class Bar

class FooBar extends Foo[Bar] { def get = new Bar }

object Baz {
  def something[T, U <: Foo[T]](foo: Class[U]): T = foo.newInstance.get
}
```

I should be able to do something like this, right?

```scala
Baz.something(classOf[FooBar])
```

Strangely this is throwing:

```
inferred type arguments [Nothing,this.FooBar] do not conform to
method something's type parameter bounds [T,U <: this.Foo[T]]
```

Which is weird :S. By the way, I'm hitting this issue while migrating some Java code that's equivalent to what I wrote here, and the Java version works fine.

Comment: The Foo class is missing an abstract keyword. – Oct 18, 2012 at 18:59

3 Answers


You've run into one of the more annoying limitations of Scala's type inference! See this answer for a clear explanation of why the compiler is choking here.

You have a handful of options. Most simply you can just provide the types yourself:

```scala
Baz.something[Bar, FooBar](classOf[FooBar])
```

But that's annoyingly verbose. If you really don't care about U, you can leave it out of the type argument list:

```scala
object Baz {
  def something[T](foo: Class[_ <: Foo[T]]): T = foo.newInstance.get
}
```
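For instance, with that signature the call site no longer needs explicit type arguments. A self-contained sketch reusing the question's classes (with the missing `abstract` added):

```scala
abstract class Foo[T] { def get: T }
class Bar
class FooBar extends Foo[Bar] { def get = new Bar }

object Baz {
  // T now appears in the value parameter's type, so the compiler can
  // infer it directly from the Class argument at the call site.
  def something[T](foo: Class[_ <: Foo[T]]): T = foo.newInstance.get
}

val bar: Bar = Baz.something(classOf[FooBar]) // T is inferred as Bar
```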

Now T will be inferred correctly (as Bar) in your example. You can also use a trick discussed in the answer linked above:

```scala
object Baz {
  def something[T, U <% Foo[T]](foo: Class[U]): T = foo.newInstance.get
}
```

Why this works is a little tricky: the key is that after the view bound is desugared, T no longer appears in U's bound.
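Roughly, the view bound desugars into an ordinary implicit parameter (a sketch of the desugaring; the compiler generates a fresh name for the evidence parameter):

```scala
abstract class Foo[T] { def get: T }
class Bar
class FooBar extends Foo[Bar] { def get = new Bar }

object Baz {
  // Roughly what `U <% Foo[T]` desugars to: an implicit view U => Foo[T].
  // U's declared bounds no longer mention T, so the compiler can infer
  // U = FooBar first, and T = Bar is then fixed while resolving the view
  // (subtyping supplies a FooBar => Foo[Bar] conversion).
  def something[T, U](foo: Class[U])(implicit view: U => Foo[T]): T =
    view(foo.newInstance).get
}

val bar: Bar = Baz.something(classOf[FooBar]) // no explicit type arguments needed
```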




It does not compile because T does not appear anywhere in the parameter list and thus cannot be inferred (or rather, it is inferred as Nothing).

You can fix it like this:

```scala
def something[T](foo: Class[_ <: Foo[T]]): T = foo.newInstance.get
```

Comment: Very true! Missed that detail.

Adding the types explicitly seems to work:

```scala
scala> Baz.something[Bar, FooBar](classOf[FooBar])
res1: Bar = Bar@3bb0ff0
```

Usually, getting a Nothing where you expect some other type to be inferred means that, for some reason, the compiler is not able to infer the type in that particular position.
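The canonical instance of that fallback is an empty `List()` (a small illustrative sketch, not from the original answer):

```scala
// When nothing constrains a type parameter, scalac falls back to Nothing.
val xs = List()                    // nothing constrains the element type...
val evidence: List[Nothing] = xs   // ...so this ascription compiles: List[Nothing]

// The question hits the same rule: T occurs only in something's result
// type and in U's bound, never in the value parameters, so the compiler
// fixes T = Nothing before checking U <: Foo[T], and FooBar is not a
// subtype of Foo[Nothing].
```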

That said, the question becomes: how could we refactor the code so that the type gets inferred? @Régis' answer shows the solution.

