
sbt package failed with unresolved dependency #203

Open
haojinIntel opened this issue May 11, 2021 · 5 comments

@haojinIntel

I've installed sbt-0.13.15 in my environment. When I run "sbt package" I get the following exceptions:
(screenshot of the unresolved-dependency errors)
Has anyone met a similar issue, and how can I fix it?

@eavilaes commented May 11, 2021

It seems that some dependency repositories have shut down, so you have to manage the dependencies for this jar manually.

We had to remove the sbt-spark-package plugin by erasing it from project/plugins.sbt and include the Spark dependencies manually. We changed a few things in the following files:

In build.sbt you must change this:
(screenshot of the modified build.sbt)
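As a rough illustration, a build.sbt without sbt-spark-package might declare the Spark modules directly (the Spark and Scala versions below are assumptions; match them to your cluster):

    // Sketch of build.sbt with the Spark dependencies declared manually
    // instead of through the sbt-spark-package plugin's sparkComponents.
    name := "spark-sql-perf"

    scalaVersion := "2.12.15"   // assumption; use the Scala version your Spark build targets

    val sparkVersion = "3.1.2"  // assumption; use your cluster's Spark version

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core"  % sparkVersion % "provided",
      "org.apache.spark" %% "spark-sql"   % sparkVersion % "provided",
      "org.apache.spark" %% "spark-hive"  % sparkVersion % "provided",
      "org.apache.spark" %% "spark-mllib" % sparkVersion % "provided"
    )

The "provided" scope keeps the Spark jars out of the packaged artifact, since spark-submit supplies them at runtime.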

And this in project/plugins.sbt:
(screenshot of the modified project/plugins.sbt)
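Correspondingly, project/plugins.sbt would drop the Spark Packages resolver and the addSbtPlugin line for sbt-spark-package; a minimal sketch (the remaining sbt-assembly plugin and its version are illustrative assumptions):

    // Sketch of project/plugins.sbt after removing sbt-spark-package:
    // the "Spark Packages repo" resolver and the
    // addSbtPlugin(... % "sbt-spark-package" % ...) line are deleted.
    addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")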

That worked for us!

Edit: I forgot to mention that we had to remove the sbt-spark-package plugin.

@eavilaes

@haojinIntel I forgot to mention that you must remove the sbt-spark-package plugin; I've edited the previous message 😄

@pingsutw commented Jun 6, 2021

@evanye Where could I download those jars, and where should I put them?
Sorry, I'm a beginner with sbt.

@eavilaes commented Jun 7, 2021

@evanye Where could I download those jars, and where should I put them?
Sorry, I'm a beginner with sbt.

You must build the jars as explained in https://github.com/databricks/spark-sql-perf#build

@AlessandroPomponio

For anyone stumbling across this issue, it can be fixed by changing the following line in project/plugins.sbt:
resolvers += "Spark Packages repo" at "https://dl.bintray.com/spark-packages/maven/"
to
resolvers += "Spark Packages repo" at "https://repos.spark-packages.org/"
As already noted in PRs #204 and #206.
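In other words, the sbt-spark-package plugin can stay and only the resolver URL needs to change, since the artifacts formerly hosted on Bintray are served from the new host. A sketch of the resulting project/plugins.sbt (the plugin version shown is an assumption; keep whatever the repository already pins):

    // Point the Spark Packages resolver at the new host instead of Bintray.
    resolvers += "Spark Packages repo" at "https://repos.spark-packages.org/"

    // Plugin version is an assumption; keep the one already in the repository.
    addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.6")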
