
Getting Started guide for K8s incorrectly states minikube doesn't support LoadBalancer #11582

Open
biasedbit opened this issue Oct 11, 2023 · 3 comments · May be fixed by pulumi/pulumi-hugo#3521
Labels: area/docs (Improvements or additions to documentation), grooming-q2y25, impact/first-48 (This bug is likely to be hit during a user's first 48 hours of product evaluation), needs-triage (Needs attention from the triage team), size/S (Estimated effort to complete, 1-2 days)

Comments


biasedbit commented Oct 11, 2023

The Pulumi & Kubernetes guide can be updated to remove the conditional service logic for minikube. I got through the example using minikube with isMinikube set to false.

Steps to reproduce:

  1. In a separate terminal, run minikube tunnel (required for the LoadBalancer service to obtain the IP 127.0.0.1)

  2. Follow all instructions with isMinikube set to false

  3. Modify the frontend service's external port to 8080 (a sketch of the resulting service appears after the notes below):

    -- "ports": [{ "port": 80, "target_port": 80, "protocol": "TCP" }],
    ++ "ports": [{ "port": 8080, "target_port": 80, "protocol": "TCP" }],
  4. Run pulumi up

     Type                              Name               Status            
     +   pulumi:pulumi:Stack               pulumi-test-local  created (13s)     
     +   ├─ kubernetes:apps/v1:Deployment  nginx              created (2s)      
     +   └─ kubernetes:core/v1:Service     nginx              created (10s)     
    
    Outputs:
        ip: "127.0.0.1"
    
    Resources:
        + 3 created
    
  5. Use curl to validate setup:

    ❯ curl -I 127.0.0.1:8080
    HTTP/1.1 200 OK
    Server: nginx/1.25.2
    ...

Notes:

  • Used port 8080 instead of 80, because exposing port 80 makes minikube tunnel prompt for an admin password
  • minikube tunnel must be running; otherwise service creation stalls waiting for IP assignment
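
For reference, here is a minimal sketch of what the example looks like without the isMinikube conditional and with the frontend port set to 8080, as described in step 3. This is an approximation using the Python pulumi_kubernetes SDK, not the guide's exact code; the resource names, labels, and image are illustrative assumptions.

    import pulumi
    import pulumi_kubernetes as k8s

    # Illustrative labels; the guide's actual values may differ.
    app_labels = {"app": "nginx"}

    deployment = k8s.apps.v1.Deployment(
        "nginx",
        spec={
            "selector": {"match_labels": app_labels},
            "replicas": 1,
            "template": {
                "metadata": {"labels": app_labels},
                "spec": {"containers": [{"name": "nginx", "image": "nginx"}]},
            },
        },
    )

    # Always a LoadBalancer service, with no isMinikube branch. With
    # `minikube tunnel` running, the service receives an external IP
    # (127.0.0.1 in the steps above). External port 8080 avoids the
    # admin-password prompt that port 80 triggers in the tunnel.
    frontend = k8s.core.v1.Service(
        "nginx",
        metadata={"labels": app_labels},
        spec={
            "type": "LoadBalancer",
            "selector": app_labels,
            "ports": [{"port": 8080, "target_port": 80, "protocol": "TCP"}],
        },
    )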

Env details

❯ uname -orm
Darwin 23.0.0 arm64

❯ minikube version
minikube version: v1.31.2
commit: fd7ecd9c4599bef9f04c0986c4a0187f98a4396e

❯ pulumi version
v3.88.0
@github-actions github-actions bot added the needs-triage Needs attention from the triage team label Oct 11, 2023
@interurban interurban added the area/docs Improvements or additions to documentation label Oct 11, 2023
@desteves desteves self-assigned this Oct 13, 2023
@desteves desteves removed the needs-triage Needs attention from the triage team label Oct 13, 2023
@desteves desteves added this to the 0.95 milestone Oct 13, 2023
@Rajakavitha1

Hi @biasedbit!
I tried the instructions you detailed and they do work. I also checked: https://minikube.sigs.k8s.io/docs/handbook/accessing/#loadbalancer-access.

However, what I observed differently in the steps was that minikube tunnel does not prompt for any password, and curl "http://localhost:8080" does not work unless the port is forwarded.
pulumiminikube.txt
kubernetesservices.txt
testingthepulumistack.txt

A better approach to the documentation update would be to remove the conditional service logic (isMinikube) from all of the code examples and mention using minikube tunnel; a rough sketch of the simplified output export is below.
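
As an illustration of that suggestion only: this is a sketch, assuming the Python pulumi_kubernetes SDK and a LoadBalancer service named frontend like the one sketched earlier in this thread, not the guide's actual code.

    # With the conditional removed, the external address is always read from
    # the LoadBalancer status; `minikube tunnel` must be running on minikube
    # for this value to be populated.
    pulumi.export(
        "ip",
        frontend.status.apply(
            lambda s: s.load_balancer.ingress[0].ip
            or s.load_balancer.ingress[0].hostname
        ),
    )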
Also:
I noticed something else too. In the getting started guide there are no instructions to deploy an app; we only provide instructions to check whether the service is deployed. If we use YAML, the isMinikube config need not be set at all, and the deployment goes through. Here's the terminal output of the same.
yamltestwithoutisMinikube.txt

@ringods and @desteves, please let me know your thoughts on this approach to the documentation; I am more than happy to submit a PR with the changes.

@Rajakavitha1 Rajakavitha1 linked a pull request Oct 22, 2023 that will close this issue
@Rajakavitha1

Just submitted a PR with changes for Python and YAML: pulumi/pulumi-hugo#3521

@desteves desteves modified the milestones: 0.95, 0.96 Oct 25, 2023
@desteves desteves added the size/S Estimated effort to complete (1-2 days). label Oct 27, 2023
@mikhailshilkov mikhailshilkov modified the milestones: 0.96, 0.97 Nov 23, 2023
@desteves desteves modified the milestones: 0.97, 0.98 Dec 11, 2023
@desteves desteves modified the milestones: 0.98, 0.99 Jan 5, 2024
@desteves desteves added the impact/first-48 This bug is likely to be hit during a user's first 48 hours of product evaluation label Jan 9, 2024
@interurban interurban modified the milestones: 0.99, 0.100 Jan 30, 2024
@sean1588 sean1588 transferred this issue from pulumi/pulumi-hugo May 10, 2024
@thoward thoward added the needs-triage Needs attention from the triage team label Oct 11, 2024

thoward commented Nov 9, 2024

Checking in here: @desteves (pulumi/pulumi-hugo#3841) and @Rajakavitha1 (pulumi/pulumi-hugo#3521) had both submitted PRs that would address this issue; however, they were never merged, and the repo the PRs were on (pulumi-hugo) is now archived. It seems that @desteves' PR would supersede the earlier one by @Rajakavitha1, though both are still "open".

@interurban We migrated the issues from that repo, but it seems the open PRs didn't migrate? This feels like a loss of good work. Should we move pulumi/pulumi-hugo#3841 here and get it merged? Or do we prefer to abandon these PRs and keep the example as-is? If so, we should probably close this ticket as won't-fix.

@thoward thoward self-assigned this Dec 2, 2024