Encountered "/" at line 1... when configuring a Databricks data source for External Compute

Created by Matt Nolan, Modified on Tue, Jan 20 at 5:15 PM by Matt Nolan

If you encounter an error like the one below when configuring a Databricks data source for External Compute, it may be related to permissions: the Spark job may not have permission to read the mapping file from DBFS.

Encountered " "/" "/ "" at line 1, column 1
Was expecting one of: <HINT>... 


The Spark connector expects the SMS mapping either as an inline string or as a path to an SMS file, controlled by a single parameter. For External Compute, Stardog writes the mapping to a temporary SMS file in Databricks under /FileStore/stardog/ and then tries to read it during Spark job execution.
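As a quick check, you can confirm from a notebook attached to the same cluster that the temporary mapping files are visible and readable. The following is a minimal sketch only; the /FileStore/stardog/ path comes from the description above, and the commented-out file name is just a placeholder.

# Run in a notebook attached to the cluster used for External Compute.
# A permission error here points at the same DBFS access problem the Spark job hits.
files = dbutils.fs.ls("/FileStore/stardog/")
for f in files:
    print(f.path, f.size)

# Optionally preview the start of one mapping file (replace the placeholder name)
# to confirm it is readable and contains the expected SMS content:
# print(dbutils.fs.head("/FileStore/stardog/<mapping-file>.sms", 500))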


Refer to the Databricks documentation on DBFS access permissions, particularly the Access Mode configured on the Databricks compute cluster.


One option is to set the cluster Access Mode to Shared if you expect both regular materializations from pipelines and ad-hoc ones from users. However, this requires that each job-submitting user be granted ANY FILE access on DBFS.
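As an illustration of that grant, a workspace admin can run something like the following from a notebook. This is a sketch under assumptions: the email address is a placeholder, and the exact privilege model (legacy table ACLs on hive_metastore vs. Unity Catalog) depends on your workspace, so check the Databricks documentation for the variant that applies to you.

# Sketch only: grant ANY FILE access to the user that submits materialization jobs.
# The email address is a placeholder; replace it with the actual job-submitting user.
spark.sql("GRANT SELECT ON ANY FILE TO `user@example.com`")

# Optionally review the grants for that user; the SHOW GRANTS syntax may vary
# slightly depending on your workspace's privilege model.
# spark.sql("SHOW GRANTS `user@example.com` ON ANY FILE").show(truncate=False)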


A second option is to set the cluster Access Mode to Dedicated and tie it to a single user who submits the materialization jobs.

