Package org.apache.spark.sql.artifact
Class ArtifactManager
java.lang.Object
org.apache.spark.sql.artifact.ArtifactManager
- All Implemented Interfaces:
org.apache.spark.internal.Logging
This class handles the storage of artifacts as well as preparing the artifacts for use.
Artifacts belonging to different SparkSessions are isolated from each other with the help of the
sessionUUID.
Jar, class file, and Python file artifacts are stored under the "jars", "classes", and "pyfiles" sub-directories respectively, while other types of artifacts are stored under the root directory for that particular SparkSession.
Parameters:
session - The object used to hold the Spark Connect session state.
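To make the per-session layout described above concrete, the following Scala sketch registers a locally staged jar with a session's ArtifactManager and then lists the jar URLs visible to that session. It assumes Path means java.nio.file.Path, that an ArtifactManager instance for the current SparkSession is already in hand (how it is obtained is session-internal and may differ between Spark versions), and the helper name and file names are purely illustrative.

    import java.nio.file.{Files, Path, Paths, StandardCopyOption}
    import org.apache.spark.sql.artifact.ArtifactManager

    // Hypothetical helper: register a jar with a session's ArtifactManager.
    def addSessionJar(artifactManager: ArtifactManager, localJar: Path): Unit = {
      // Stage the jar at a server-local location first (addArtifact works on staged artifacts).
      val staging = Files.createTempFile("artifact-staging-", ".jar")
      Files.copy(localJar, staging, StandardCopyOption.REPLACE_EXISTING)

      // The remote relative path places the jar under this session's "jars" sub-directory.
      val remoteRelativePath = Paths.get("jars", localJar.getFileName.toString)
      artifactManager.addArtifact(remoteRelativePath, staging, None)

      // The jar is now visible to this session only.
      artifactManager.getAddedJars.foreach(url => println(s"session jar: $url"))
    }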
-
Nested Class Summary
Nested classes/interfaces inherited from interface org.apache.spark.internal.Logging
org.apache.spark.internal.Logging.LogStringContext, org.apache.spark.internal.Logging.SparkShellLoggingFilter
-
Constructor Summary
Constructors
-
Method Summary
void addArtifact(Path remoteRelativePath, Path serverLocalStagingPath, scala.Option<String> fragment)
Add and prepare a staged artifact (i.e. an artifact that has been rebuilt locally from bytes over the wire) for use.
static String ARTIFACT_DIRECTORY_PREFIX()
ClassLoader classloader()
Returns a ClassLoader for session-specific jar/class file resources.
static String forwardToFSPrefix()
scala.collection.immutable.Seq<URL> getAddedJars()
Get the URLs of all jar artifacts.
scala.collection.immutable.Seq<String> getPythonIncludes()
Get the py-file names added to this SparkSession.
static org.apache.spark.internal.Logging.LogStringContext LogStringContext(scala.StringContext sc)
static org.slf4j.Logger org$apache$spark$internal$Logging$$log_()
static void org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1)
void uploadArtifactToFs(Path remoteRelativePath, Path serverLocalStagingPath)
<T> T withResources(scala.Function0<T> f)
Methods inherited from class java.lang.Object
equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface org.apache.spark.internal.Logging
initializeForcefully, initializeLogIfNecessary, initializeLogIfNecessary, initializeLogIfNecessary$default$2, isTraceEnabled, log, logDebug, logDebug, logDebug, logDebug, logError, logError, logError, logError, logInfo, logInfo, logInfo, logInfo, logName, LogStringContext, logTrace, logTrace, logTrace, logTrace, logWarning, logWarning, logWarning, logWarning, org$apache$spark$internal$Logging$$log_, org$apache$spark$internal$Logging$$log__$eq, withLogContext
-
Constructor Details
-
ArtifactManager
-
-
Method Details
-
forwardToFSPrefix
public static String forwardToFSPrefix()
-
ARTIFACT_DIRECTORY_PREFIX
public static String ARTIFACT_DIRECTORY_PREFIX()
-
org$apache$spark$internal$Logging$$log_
public static org.slf4j.Logger org$apache$spark$internal$Logging$$log_() -
org$apache$spark$internal$Logging$$log__$eq
public static void org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1) -
LogStringContext
public static org.apache.spark.internal.Logging.LogStringContext LogStringContext(scala.StringContext sc) -
withResources
public <T> T withResources(scala.Function0<T> f) -
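No description is given for withResources, so the following is only a usage sketch. The signature above is the Java view of a Scala by-name parameter (rendered as scala.Function0), so from Scala the method is typically called with a plain block; the assumption here is that the block runs with this session's artifact resources (for example its classloader) available.

    import org.apache.spark.sql.artifact.ArtifactManager

    // Assumed usage: evaluate a block while this session's artifacts are in scope,
    // e.g. to resolve a class that was shipped to the session as a jar artifact.
    def loadWithSessionArtifacts(artifactManager: ArtifactManager, className: String): Class[_] =
      artifactManager.withResources {
        Class.forName(className, /* initialize = */ false, artifactManager.classloader)
      }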
getAddedJars
public scala.collection.immutable.Seq<URL> getAddedJars()
Get the URLs of all jar artifacts.
Returns:
(undocumented)
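A small illustrative example of consuming the result; the diagnostic printing is not something this API prescribes.

    import java.net.URL
    import org.apache.spark.sql.artifact.ArtifactManager

    // Log every jar artifact currently registered for this session.
    def describeSessionJars(artifactManager: ArtifactManager): Unit = {
      val jars: Seq[URL] = artifactManager.getAddedJars
      jars.foreach(url => println(s"jar artifact: $url"))
    }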
-
getPythonIncludes
public scala.collection.immutable.Seq<String> getPythonIncludes()
Get the py-file names added to this SparkSession.
Returns:
(undocumented)
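For illustration, a sketch that joins the returned names into a single path-separated string; the use of File.pathSeparator is an assumption about how a caller might consume the list, not behaviour defined by this API.

    import java.io.File
    import org.apache.spark.sql.artifact.ArtifactManager

    // Join the py-file names added to this SparkSession into one include list.
    def pythonIncludeList(artifactManager: ArtifactManager): String =
      artifactManager.getPythonIncludes.mkString(File.pathSeparator)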
-
addArtifact
public void addArtifact(Path remoteRelativePath, Path serverLocalStagingPath, scala.Option<String> fragment)
Add and prepare a staged artifact (i.e. an artifact that has been rebuilt locally from bytes over the wire) for use.
Parameters:
remoteRelativePath - (undocumented)
serverLocalStagingPath - (undocumented)
fragment - (undocumented)
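A hedged sketch of calling addArtifact. The parameters are undocumented here, so the interpretation below (remoteRelativePath as the destination relative to the session's artifact root, serverLocalStagingPath as the rebuilt bytes on the server, fragment as an optional alias) is an assumption; the helper name and paths are hypothetical.

    import java.nio.file.{Files, Path, Paths}
    import org.apache.spark.sql.artifact.ArtifactManager

    // Hypothetical helper: register an artifact that has already been rebuilt
    // on the server from bytes sent over the wire.
    def registerStagedArtifact(
        artifactManager: ArtifactManager,
        remoteRelativePath: Path,       // e.g. Paths.get("jars", "udf-deps.jar")
        serverLocalStagingPath: Path,   // where the rebuilt bytes currently live
        fragment: Option[String] = None): Unit = {
      require(Files.exists(serverLocalStagingPath),
        s"staged artifact not found: $serverLocalStagingPath")
      artifactManager.addArtifact(remoteRelativePath, serverLocalStagingPath, fragment)
    }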
-
classloader
public ClassLoader classloader()
Returns a ClassLoader for session-specific jar/class file resources.
Returns:
(undocumented)
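A brief example of using the session classloader; the class name is hypothetical, and the assumption is that the returned classloader can see jar/class artifacts added to this session only.

    import org.apache.spark.sql.artifact.ArtifactManager

    // Resolve a class that was shipped to this session as a jar/class artifact.
    def resolveSessionClass(artifactManager: ArtifactManager, className: String): Class[_] = {
      val sessionClassLoader: ClassLoader = artifactManager.classloader
      Class.forName(className, /* initialize = */ false, sessionClassLoader)
    }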
-
uploadArtifactToFs
public void uploadArtifactToFs(Path remoteRelativePath, Path serverLocalStagingPath)
-