Is it possible to create a Python script that imports shapefiles/feature classes from different sources into specific feature datasets in a geodatabase, all in one script?

I managed to write a script that creates multiple feature datasets in the geodatabase. This is the code:

    import arcpy
    arcpy.CreateFileGDB_management("D:/GIS_Temp", "TEST.gdb")
    fdList = ["Dataset_A", "Dataset_B", "Dataset_C"]
    for fd in fdList:
        arcpy.CreateFeatureDataset_management("D:/GIS_Temp/TEST.gdb", fd, "D:/GIS_Temp/Projection.prj")

And now I would like to import shapefiles/feature classes into specific datasets: for example, from "Folder_A" to "Dataset_A", from "Folder_B" to "Dataset_B", and so on.

I could do it one script at a time, like this:

    import arcpy
    arcpy.env.workspace = 'D:/GIS_Temp/Folder_A'
    arcpy.FeatureClassToGeodatabase_conversion(["File_X", "File_Y"], 'D:/GIS_Temp/Test.gdb/Dataset_A')

But is it possible to do it all in one script? I am really new to Python scripting, so I don't know how to combine those separate scripts into one that does the whole task.

If I use this code, I can get the shapefiles into the root of the geodatabase, but not into a feature dataset:

    import arcpy
    from arcpy import env
    import os
    env.workspace = "D:/GIS_Temp/Folder_A/"
    fcList = arcpy.ListFeatureClasses()
    for fc in fcList:
        arcpy.CopyFeatures_management(fc, "D:/GIS_Temp/Test.gdb/" + os.sep + fc.rstrip(".shp"))

And if I add the name of the feature dataset like this, it won't run:

arcpy.CopyFeatures_management(fc, "D:/GIS_Temp/Test.gdb/Dataset_A" + os.sep + fc.rstrip(".shp")) 
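
Part of the problem may be how I build the output path: I am mixing a forward slash with os.sep, and rstrip(".shp") strips trailing characters rather than the extension. Here is a sketch of how I imagine the path should be constructed (untested, and "Dataset_A" is just the example target):

    import os
    import arcpy

    gdb = "D:/GIS_Temp/Test.gdb"
    # splitext drops the ".shp" extension cleanly; rstrip(".shp") would strip
    # any trailing '.', 's', 'h' or 'p' characters from the name instead.
    out_name = os.path.splitext(fc)[0]
    arcpy.CopyFeatures_management(fc, os.path.join(gdb, "Dataset_A", out_name))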

This could work:

    import arcpy
    arcpy.env.workspace = 'D:/GIS_Temp/Folder_A'
    arcpy.FeatureClassToGeodatabase_conversion(["File_X", "File_Y"], 'D:/GIS_Temp/Test.gdb/Dataset_A')

    import arcpy
    arcpy.env.workspace = 'D:/GIS_Temp/Folder_B'
    arcpy.FeatureClassToGeodatabase_conversion(["File_X", "File_Y"], 'D:/GIS_Temp/Test.gdb/Dataset_B')

but it only goes from one folder to one dataset at a time. I would like to write ONE script that imports defined shapefiles/feature classes into their specific feature datasets.
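
What I have in mind is something like the sketch below: a mapping from each source folder to the shapefiles I want from it and its target feature dataset, looped over in a single script. The folder, file, and dataset names are just placeholders taken from the examples above, and I have not tested this:

    import arcpy

    gdb = 'D:/GIS_Temp/Test.gdb'

    # Placeholder mapping: source folder -> (shapefiles to import, target feature dataset)
    import_plan = {
        'D:/GIS_Temp/Folder_A': (["File_X", "File_Y"], 'Dataset_A'),
        'D:/GIS_Temp/Folder_B': (["File_X", "File_Y"], 'Dataset_B'),
    }

    for folder, (files, dataset) in import_plan.items():
        # Point the workspace at the source folder, then convert the listed
        # shapefiles into the matching feature dataset in one call.
        arcpy.env.workspace = folder
        arcpy.FeatureClassToGeodatabase_conversion(files, gdb + '/' + dataset)

Would this be a reasonable way to structure it, or is there a better approach?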
