[Contributors Welcome] Add missing torch::nn layers in C++ API

@yf225 Interested in Convolution or Vision layers.


@yf225 I’m interested in implementing the mentioned Distance functions :slightly_smiling_face:.


I can work on linear layers

Interested in AdaptiveMaxPool layers

@yf225 I’m interested in all of the Vision layers

I'd be interested in contributing to the non-linear activations.

I can get started with the distance functions: cosine similarity and pairwise distance.

@yf225 @ptrblck I'm interested in contributing, and I'm excited about the C++ frontend API. Other than this example, do you have any other examples?


@yf225 I’m interested in the Convolution layers and non-linear activations (basic+other)

Thank you all so much for your interest; I really appreciate it. I am currently working on cutting out the tasks for each layer (most of them involve adding the corresponding torch::nn::functional layer as well), and I will share them with all of you (along with example PRs to get started) as soon as possible.


@yf225: I would be happy to attempt to contribute code to Convolutions, Vision, and Non-linearities if all the tasks aren't already picked up.

Great to see the work on the C++ API. I am not sure whether I am able to contribute to this, but I could give it a try. I could definitely contribute on the use-case side: I will integrate it with other OSS (such as Scilab) to make calling the C++ API easier for others, especially beginners with less programming background.


I want to contribute to the non-linear activations or DataParallel layers if there is still time.

I can try working on the Normalization layer implementations.

I would like to work on ReflectionPad1d and ReflectionPad2d

Hi, I would love to work on the loss functions.

Hi, I would like to work on CrossEntropyLoss and NLLLoss.

@yf225, please let me know in case of any conflicts.

Thanks everyone for your interest! We now have many great example PRs at https://github.com/pytorch/pytorch/issues/25883, and please let us know (by responding in the Github issue) if there is any layer you would like to work on. :smiley:


I would go with MXNet over PyTorch, even though I prefer PyTorch when using Python. MXNet is open to having language bindings in multiple languages. Once you get a minimum viable product, you could donate it to the main project. Bindings that are part of such a large, well-known project are likely to attract more contributors than bindings maintained under your own account for PyTorch (I could not see PyTorch officially supporting Rust).