Uploading PySpark 2.1.1 to PyPI

8 messages
Uploading PySpark 2.1.1 to PyPI

Holden Karau
Just a heads up: I'm in the process of trying to upload the latest PySpark to PyPI (we are blocked on a ticket with the PyPI folks around the file size limit, but I'll follow up with them).

Relatedly, PySpark is available on conda-forge (currently 2.1.0), and a PR to update it to 2.1.1 is in progress.

Happy Python Spark adventures, everyone :)
--
Cell : 425-233-8271

Re: Uploading PySpark 2.1.1 to PyPI

cloud0fan
Hi Holden,

Thanks for working on this! Do we have a JIRA ticket to track it? We should make this part of the release process for all future Spark releases, and it would be great to have a JIRA ticket recording the detailed steps, and even automating them.

Thanks,
Wenchen

Re: Uploading PySpark 2.1.1 to PyPI

Holden Karau
So I have a PR to add this to the release process documentation. I'm waiting on the necessary approvals from the PyPI folks before I merge it, in case anything changes as a result of the discussion (like uploading to the legacy host or something). As for conda-forge, it's not something we need to do ourselves, but I'll add a note about pinging them when we make a new release so their users can stay up to date easily. The parent JIRA for PyPI-related tasks is SPARK-18267 :)




Re: Uploading PySpark 2.1.1 to PyPI

Sameer Agarwal
Holden,

Thanks again for pushing this forward! Out of curiosity, did we get approval from the PyPI folks?

Regards,
Sameer

--
Sameer Agarwal
Software Engineer | Databricks Inc.

Re: Uploading PySpark 2.1.1 to PyPI

Xiao Li
Hi Holden,

Based on the issue https://github.com/pypa/packaging-problems/issues/90, the limit has been increased to 250MB.

Just wondering, can we publish PySpark to PyPI now? Have you created the account?

Thanks,

Xiao Li
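As an aside, the 250MB figure above is something a release script could check against a built artifact before attempting an upload. A minimal sketch: the limit constant comes from the issue linked above, and the sizes below are purely illustrative, not actual PySpark artifact sizes.

```python
# Sketch of a pre-upload size check against the 250 MB per-file limit
# discussed above (pypa/packaging-problems issue 90). Sizes below are
# illustrative only, not real PySpark artifact sizes.
LIMIT_BYTES = 250 * 1024 * 1024  # 250 MB

def fits_pypi_limit(size_bytes: int) -> bool:
    # True when the artifact is at or under the limit.
    return size_bytes <= LIMIT_BYTES

print(fits_pypi_limit(180 * 1024 * 1024))  # a ~180 MB artifact fits
print(fits_pypi_limit(300 * 1024 * 1024))  # a ~300 MB artifact does not
```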





Re: Uploading PySpark 2.1.1 to PyPI

Holden Karau
An account already exists; the PMC has the info for it. I think we will need to wait for the 2.2 artifacts to do the actual PyPI upload because of the local version string in 2.2.1, but rest assured this isn't something I've lost track of.
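For readers unfamiliar with the "local version string" problem: PEP 440 calls the segment after a `+` in a version number (e.g. `2.1.1+hadoop2.7`) a local version identifier, and PyPI rejects uploads whose version carries one. A minimal sketch of the distinction; the version strings here are illustrative, not the actual Spark artifact versions:

```python
# PEP 440: everything after '+' in a version string is a "local version
# identifier" (e.g. "2.1.1+hadoop2.7"). PyPI rejects releases whose
# version carries one, which is the kind of problem described above.
def has_local_version(version: str) -> bool:
    # A '+' separates the public version from the local segment.
    return "+" in version

# Illustrative version strings only:
print(has_local_version("2.1.1"))            # public version only
print(has_local_version("2.1.1+hadoop2.7"))  # carries a local segment
```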


Re: Uploading PySpark 2.1.1 to PyPI

Xiao Li
Hi, Holden,

That sounds good to me! 

Thanks,

Xiao



Re: Uploading PySpark 2.1.1 to PyPI

Matei Zaharia
Didn't we want to upload 2.1.1 too? What is the local version string problem?

Matei
