How to import a local Python package in Amazon Elastic MapReduce (EMR)?
I have two Python scripts that are meant to run on Amazon Elastic MapReduce — one as a mapper and one as a reducer. I recently expanded the mapper script so that it requires a few more local modules that I wrote; they all live in a package called SentimentAnalysis. What is the right way to have a Python script import from a local Python package on S3? I tried creating S3 keys that mirror my file system, hoping relative paths would work, but no luck. This is what I see in the log files after the step fails:
    Traceback (most recent call last):
      File "/mnt/var/lib/hadoop/mapred/taskTracker/hadoop/jobcache/job_201407250000_0001/attempt_201407250000_0001_m_000000_0/work/./sa_mapper.py", line 15, in <module>
        from SentimentAnalysis import NB, LR
    ImportError: No module named SentimentAnalysis

The relevant file structure is like this:
    sa_mapper.py
    sa_reducer.py
    SentimentAnalysis/NB.py
    SentimentAnalysis/LR.py

and sa_mapper.py contains:

    from SentimentAnalysis import NB, LR

I tried to mirror this file structure in S3, but it does not seem to work. What is the best way to set up S3 or EMR so that sa_mapper.py can import NB.py and LR.py? Is there a special trick to do this?
Do you have an __init__.py in the SentimentAnalysis folder?
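A minimal sketch of what the answer suggests, recreating the question's layout in a scratch directory (the module bodies are placeholders I invented; only the file names come from the question). The empty `__init__.py` is what marks SentimentAnalysis as an importable package for the Python 2 interpreters common on EMR at the time:

```python
import os
import sys
import tempfile

# Recreate the question's layout in a scratch directory.
root = tempfile.mkdtemp()
pkg = os.path.join(root, "SentimentAnalysis")
os.makedirs(pkg)

# The file the answer asks about: without it, 'SentimentAnalysis' is just
# a directory, not a package, and the import fails on Python 2.
open(os.path.join(pkg, "__init__.py"), "w").close()

# Placeholder module bodies (hypothetical; the real NB/LR code is not shown).
with open(os.path.join(pkg, "NB.py"), "w") as f:
    f.write("NAME = 'naive bayes'\n")
with open(os.path.join(pkg, "LR.py"), "w") as f:
    f.write("NAME = 'logistic regression'\n")

# Stand-in for the mapper's working directory being on the import path.
sys.path.insert(0, root)

from SentimentAnalysis import NB, LR  # the import from sa_mapper.py

print(NB.NAME, LR.NAME)
```

The same fix applies on EMR: make sure `__init__.py` is included in whatever bundle of files you ship to the task nodes, not just `NB.py` and `LR.py`.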