Changes to how users are added and deleted #235
📝 Walkthrough

Added bidirectional SQLAlchemy relationships between User and Workout; replaced HTTP-based profile image upload/delete with boto3 put_object/delete_object to DigitalOcean Spaces; added strict base64 validation and ClientError handling; renamed EditUser → EditUserById (now uses user_id: Int) and added

Changes
Sequence Diagrams

```mermaid
sequenceDiagram
    participant Client
    participant GraphQL as GraphQL Server (CreateUser)
    participant S3 as S3/Spaces (boto3)
    participant DB as Database
    Client->>GraphQL: CreateUser(net_id, encoded_image)
    GraphQL->>GraphQL: base64.b64decode(encoded_image, validate=True)
    GraphQL->>S3: put_object(Bucket, Key, Body, ContentType, ACL)
    S3-->>GraphQL: Success / ClientError
    GraphQL->>GraphQL: construct public Spaces URL
    GraphQL->>DB: INSERT user (including photo_url)
    DB-->>GraphQL: created user
    GraphQL-->>Client: Return created user + photo_url
```
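The upload path in the diagram above can be sketched as plain Python. This is a hedged sketch, not the PR's actual code: the bucket name `appdev-upload`, key prefix, and `nyc3` region come from the review comments below, and injecting the `s3` client as a parameter is an assumption made so the flow is testable.

```python
import base64

BUCKET = "appdev-upload"            # bucket named in the review comments
PREFIX = "uplift-dev/user-profile"  # key prefix named in the review comments

def profile_key(net_id):
    # Object key under which the profile image is stored
    return f"{PREFIX}/{net_id}-profile.png"

def profile_url(net_id):
    # Public Spaces URL: the bucket must appear as the subdomain --
    # omitting it is exactly the bug flagged later in this review
    return f"https://{BUCKET}.nyc3.digitaloceanspaces.com/{profile_key(net_id)}"

def upload_profile_image(s3, net_id, encoded_image):
    """Decode and upload the avatar; return the public URL, or None if no image."""
    if not encoded_image:  # guard: encoded_image is optional (required=False)
        return None
    image_data = base64.b64decode(encoded_image, validate=True)
    s3.put_object(
        Bucket=BUCKET,
        Key=profile_key(net_id),
        Body=image_data,
        ContentType="image/png",
        ACL="public-read",
    )
    return profile_url(net_id)
```

Because the key and URL come from the same two helpers, the stored URL always points at the object that was actually uploaded.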
```mermaid
sequenceDiagram
    participant Client
    participant GraphQL as GraphQL Server (DeleteUserById)
    participant DB as Database
    participant S3 as S3/Spaces (boto3)
    Client->>GraphQL: DeleteUserById(user_id)
    GraphQL->>GraphQL: @jwt_required() / verify identity == user_id
    GraphQL->>DB: Query user by id
    DB-->>GraphQL: user record (contains net_id/encoded_image)
    alt user.encoded_image present
        GraphQL->>S3: delete_object(Bucket, Key)
        S3-->>GraphQL: Success / ClientError
    end
    GraphQL->>DB: DELETE user record
    DB-->>GraphQL: User deleted
    GraphQL-->>Client: Deletion confirmed
```
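The delete path can be sketched similarly. This is a hedged sketch, not the PR's code: the accurate log message and the re-raise follow the suggestions made later in this review, and the `s3` client is injected for testability. A real implementation would catch `botocore.exceptions.ClientError` specifically.

```python
def delete_profile_image(s3, net_id):
    """Remove the stored avatar; log an accurate message and propagate failures
    so the caller can abort the DB delete instead of orphaning the object."""
    key = f"uplift-dev/user-profile/{net_id}-profile.png"
    try:
        s3.delete_object(Bucket="appdev-upload", Key=key)
    except Exception as e:  # real code: except botocore.exceptions.ClientError as e
        print(f"S3 delete error for {key}: {e}")
        raise  # let the caller decide whether to roll back the user deletion
```

Propagating the error keeps the DB delete and the object delete consistent; swallowing it (as the PR currently does) leaves orphaned images.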
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
✏️ Tip: You can configure your own custom pre-merge checks in the settings.
Actionable comments posted: 4
🧹 Nitpick comments (2)
src/schema.py (2)
10-10: Unused import flagged by static analysis.
WeeklyChallengeModel is imported but never used. Consider removing this import.

Proposed fix:

```diff
-from src.models.weekly_challenge import WeeklyChallenge as WeeklyChallengeModel
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` at line 10, The import WeeklyChallengeModel in src/schema.py is unused; remove the line importing WeeklyChallenge from src.models.weekly_challenge (the symbol WeeklyChallengeModel) so there is no unused import left in the module.
833-837: Consider extracting S3 client initialization to avoid duplication.

The boto3 client initialization is duplicated in CreateUser (lines 833-837) and DeleteUserById (lines 1170-1174). Consider extracting it to a helper function or a module-level lazy singleton.

Example helper function:

```python
def get_s3_client():
    return boto3.client(
        's3',
        endpoint_url=os.getenv("DIGITAL_OCEAN_URL"),
        aws_access_key_id=os.getenv("DIGITAL_OCEAN_ACCESS"),
        aws_secret_access_key=os.getenv("DIGITAL_OCEAN_SECRET_ACCESS")
    )
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` around lines 833 - 837, Extract the duplicated boto3.client('s3', ...) initialization into a shared helper (e.g., get_s3_client) or a module-level lazy singleton (e.g., _s3_client with a get_s3_client() accessor) and update the CreateUser and DeleteUserById handlers to call that helper instead of inlining the client creation; ensure the helper reads the same env vars (DIGITAL_OCEAN_URL, DIGITAL_OCEAN_ACCESS, DIGITAL_OCEAN_SECRET_ACCESS) and returns a boto3 S3 client so both CreateUser and DeleteUserById use the single implementation.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/schema.py`:
- Around line 1078-1079: Re-enable JWT protection on the logWorkout mutation by
restoring the `@jwt_required`() decorator above the mutate method and enforce
token-based user identity: inside mutate (the logWorkout resolver) call the JWT
helper (e.g., get_jwt_identity or equivalent) to obtain the authenticated user
id and validate it against the incoming user_id (or reject if mismatched) so
that only the authenticated user can log their own workouts; also ensure the JWT
decorator import is present (from flask_jwt_extended import jwt_required,
get_jwt_identity) and add an appropriate unauthorized error path.
- Around line 1022-1023: The SetWorkoutGoals.mutate method has its
`@jwt_required`() decorator commented out; re-enable authentication by restoring
`@jwt_required`() above the mutate definition, ensure jwt_required is imported
from flask_jwt_extended (or the project's JWT helper), and inside mutate use
get_jwt_identity() (or equivalent) to verify the caller matches user_id (or has
permission) before applying changes to prevent unauthorized updates.
- Around line 1176-1182: The except block after s3.delete_object is using the
wrong message and silently swallowing failures; update the handler around
s3.delete_object to log a correct, informative message (e.g., "S3 delete error
for user {user.net_id}") including the exception details (ClientError e) and
propagate the failure instead of swallowing it—either by re-raising the
exception or returning/raising a controlled error so the caller can roll back
the DB delete; locate the try/except that calls s3.delete_object and modify the
ClientError handling accordingly.
- Around line 839-859: The code decodes and uploads encoded_image without
guarding for None and may leave final_photo_url undefined on exceptions; update
the logic around base64.b64decode(encoded_image), s3.put_object(...) and
final_photo_url so: first check if encoded_image is truthy before
decoding/uploading; if falsy, skip upload and set final_photo_url to None or a
default placeholder; catch ClientError and assign a safe final_photo_url (or
None) in the except block; remove the unused variable key; ensure
UserModel(name=..., net_id=..., email=..., encoded_image=final_photo_url) always
receives a defined value.
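The identity check described in the JWT prompts above can be sketched as a small pure helper. Hedged: in the real resolvers this would run right after `@jwt_required()` has validated the token and would raise `GraphQLError`; `PermissionError` stands in here so the sketch is self-contained, and the helper name is an assumption, not existing code.

```python
def assert_is_owner(jwt_identity, user_id):
    # Compare the token's identity with the target user_id. The real mutate()
    # would obtain jwt_identity via get_jwt_identity() from flask_jwt_extended.
    if int(jwt_identity) != int(user_id):
        raise PermissionError("Not authorized.")  # schema.py would raise GraphQLError

# Shape of a protected resolver (decorator shown as a comment to stay runnable):
# @jwt_required()
# def mutate(self, info, user_id, workout_goal):
#     assert_is_owner(get_jwt_identity(), user_id)
#     ...
```

This enforces both requirements from the prompts: authentication (the decorator) and authorization (the identity comparison), so a valid token for user A cannot modify user B.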
---
Nitpick comments:
In `@src/schema.py`:
- Line 10: The import WeeklyChallengeModel in src/schema.py is unused; remove
the line importing WeeklyChallenge from src.models.weekly_challenge (the symbol
WeeklyChallengeModel) so there is no unused import left in the module.
- Around line 833-837: Extract the duplicated boto3.client('s3', ...)
initialization into a shared helper (e.g., get_s3_client) or a module-level lazy
singleton (e.g., _s3_client with a get_s3_client() accessor) and update the
CreateUser and DeleteUserById handlers to call that helper instead of inlining
the client creation; ensure the helper reads the same env vars
(DIGITAL_OCEAN_URL, DIGITAL_OCEAN_ACCESS, DIGITAL_OCEAN_SECRET_ACCESS) and
returns a boto3 S3 client so both CreateUser and DeleteUserById use the single
implementation.
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 42acbf42-23f4-498e-a164-3409b2d55ca1
📒 Files selected for processing (4)
- requirements.txt
- src/models/user.py
- src/models/workout.py
- src/schema.py
♻️ Duplicate comments (4)
src/schema.py (4)
1031-1032: ⚠️ Potential issue | 🟠 Major

Security: JWT protection disabled on SetWorkoutGoals mutation.

The @jwt_required() decorator is commented out, allowing unauthenticated users to modify any user's workout goals. Re-enable authentication before merging.

Proposed fix:

```diff
-    # @jwt_required()
+    @jwt_required()
     def mutate(self, info, user_id, workout_goal):
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` around lines 1031 - 1032, The SetWorkoutGoals mutation currently has its authentication disabled: uncomment and restore the `@jwt_required`() decorator above the mutate method (the mutate function inside the SetWorkoutGoals class) so that JWT-authenticated users are required to call this mutation; ensure the decorator from flask_jwt_extended (or your JWT library) is imported and applied directly above def mutate(self, info, user_id, workout_goal) to enforce authentication.
843-869: ⚠️ Potential issue | 🔴 Critical

Critical: Incorrect URL construction and missing None-guard for encoded_image.

Multiple issues in this code segment:

- URL is incorrect (new issue): The constructed URL on line 863 is missing the bucket name. The upload goes to bucket appdev-upload with key uplift-dev/user-profile/{net_id}-profile.png, but the URL omits the bucket. This will result in broken profile image links.
- encoded_image can be None: Since encoded_image is optional (required=False on line 833), calling base64.b64decode(encoded_image) will raise TypeError when None is passed.
- final_photo_url undefined on error: If ClientError occurs, final_photo_url is never assigned, causing NameError on line 869.
- Dead code: response (line 854) and key (line 862) are assigned but never used.

Proposed fix:

```diff
 s3 = boto3.client(
     's3',
     endpoint_url=os.getenv("DIGITAL_OCEAN_URL"),
     aws_access_key_id=os.getenv("DIGITAL_OCEAN_ACCESS"),
     aws_secret_access_key=os.getenv("DIGITAL_OCEAN_SECRET_ACCESS")
 )
-# Decode the base64 image
-image_data = base64.b64decode(encoded_image)
-
-# Upload to Spaces
-try:
-    response = s3.put_object(
-        Bucket="appdev-upload",
-        Key=f"uplift-dev/user-profile/{net_id}-profile.png",
-        Body=image_data,
-        ContentType="image/png",
-        ACL="public-read"
-    )
-
-    key = "uplift-dev/photo.jpg"
-    final_photo_url = f"https://nyc3.digitaloceanspaces.com/uplift-dev/user-profile/{net_id}-profile.png"
-
-except ClientError as e:
-    print("Upload error:", e)
+final_photo_url = None
+if encoded_image:
+    # Decode the base64 image
+    image_data = base64.b64decode(encoded_image)
+
+    # Upload to Spaces
+    try:
+        s3.put_object(
+            Bucket="appdev-upload",
+            Key=f"uplift-dev/user-profile/{net_id}-profile.png",
+            Body=image_data,
+            ContentType="image/png",
+            ACL="public-read"
+        )
+        final_photo_url = f"https://appdev-upload.nyc3.digitaloceanspaces.com/uplift-dev/user-profile/{net_id}-profile.png"
+    except ClientError as e:
+        raise GraphQLError(f"Failed to upload profile image: {e}")

 new_user = UserModel(name=name, net_id=net_id, email=email, encoded_image=final_photo_url)
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` around lines 843 - 869, Guard against encoded_image being None before calling base64.b64decode and only attempt S3 upload when encoded_image is present; in the upload block use a single canonical key (e.g. key = f"uplift-dev/user-profile/{net_id}-profile.png") and construct final_photo_url using the bucket name and key (include "appdev-upload" in the URL) so it points to the actual object, handle ClientError by setting final_photo_url to a safe fallback (None or empty string) inside the except block, and remove unused variables (response and the previous unused key assignment) so only the used key and final_photo_url remain when creating the UserModel.
1083-1084: ⚠️ Potential issue | 🟠 Major

Security: JWT protection disabled on logWorkout mutation.

The @jwt_required() decorator is commented out, allowing unauthenticated users to log workouts for any user. Re-enable authentication before merging.

Proposed fix:

```diff
-    # @jwt_required()
+    @jwt_required()
     def mutate(self, info, workout_time, user_id, facility_id):
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` around lines 1083 - 1084, The logWorkout GraphQL mutation currently has its authentication disabled because the `@jwt_required`() decorator is commented out; re-enable authentication by uncommenting and applying `@jwt_required`() to the mutate method (the method named mutate on the logWorkout mutation class) and ensure the resolver uses the authenticated identity (e.g., read current_user or info.context after JWT validation) rather than trusting the passed user_id parameter to prevent unauthenticated or spoofed workout logs.
1181-1187: ⚠️ Potential issue | 🟡 Minor

Incorrect error message and silently swallowed delete failure.

Line 1187 prints "Upload error" but this is a delete operation. Additionally, silently swallowing the ClientError means the user is deleted from the database but their profile image may remain in cloud storage as an orphan.

Consider either propagating the error or logging it correctly and proceeding with a warning.

Proposed fix:

```diff
 try:
     s3.delete_object(
         Bucket="appdev-upload",
         Key=f"uplift-dev/user-profile/{user.net_id}-profile.png",
     )
 except ClientError as e:
-    print("Upload error:", e)
+    print(f"Warning: Failed to delete profile image for user {user.net_id}: {e}")
+    # Proceeding with user deletion despite image deletion failure
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` around lines 1181 - 1187, The delete_object error handler incorrectly logs "Upload error" and swallows failures; update the except block for s3.delete_object to log a correct, descriptive message (e.g., "S3 delete_object failed for uplift-dev/user-profile/{user.net_id}-profile.png") including the caught ClientError details and either re-raise the exception or log as a warning depending on desired flow; locate the try/except around s3.delete_object (references: s3.delete_object, ClientError, user.net_id) and change the print to use the process logger or raise the error so orphaned files are not silently ignored.
🧹 Nitpick comments (2)
src/schema.py (2)
11-11: Remove unused import.
WeeklyChallengeModel is imported but never used in this file. As per static analysis hint (F401).

Proposed fix:

```diff
-from src.models.weekly_challenge import WeeklyChallenge as WeeklyChallengeModel
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` at line 11, The import WeeklyChallengeModel in src/schema.py is unused; remove the line "from src.models.weekly_challenge import WeeklyChallenge as WeeklyChallengeModel" from the file to satisfy the F401 lint and avoid dead imports, and run tests/lint to ensure no remaining references to WeeklyChallengeModel in src/schema.py.
38-38: Remove duplicate import.
base64 is already imported on line 2. As per static analysis hint (F811).

Proposed fix:

```diff
 import boto3
 from botocore.exceptions import ClientError
-import base64
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` at line 38, The file contains a duplicate import of the base64 module (another import already exists near the top of src/schema.py); remove the redundant "import base64" statement shown in the diff so only the single original import remains, ensuring no F811 duplicate-import lint error.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Duplicate comments:
In `@src/schema.py`:
- Around line 1031-1032: The SetWorkoutGoals mutation currently has its
authentication disabled: uncomment and restore the `@jwt_required`() decorator
above the mutate method (the mutate function inside the SetWorkoutGoals class)
so that JWT-authenticated users are required to call this mutation; ensure the
decorator from flask_jwt_extended (or your JWT library) is imported and applied
directly above def mutate(self, info, user_id, workout_goal) to enforce
authentication.
- Around line 843-869: Guard against encoded_image being None before calling
base64.b64decode and only attempt S3 upload when encoded_image is present; in
the upload block use a single canonical key (e.g. key =
f"uplift-dev/user-profile/{net_id}-profile.png") and construct final_photo_url
using the bucket name and key (include "appdev-upload" in the URL) so it points
to the actual object, handle ClientError by setting final_photo_url to a safe
fallback (None or empty string) inside the except block, and remove unused
variables (response and the previous unused key assignment) so only the used key
and final_photo_url remain when creating the UserModel.
- Around line 1083-1084: The logWorkout GraphQL mutation currently has its
authentication disabled because the `@jwt_required`() decorator is commented out;
re-enable authentication by uncommenting and applying `@jwt_required`() to the
mutate method (the method named mutate on the logWorkout mutation class) and
ensure the resolver uses the authenticated identity (e.g., read current_user or
info.context after JWT validation) rather than trusting the passed user_id
parameter to prevent unauthenticated or spoofed workout logs.
- Around line 1181-1187: The delete_object error handler incorrectly logs
"Upload error" and swallows failures; update the except block for
s3.delete_object to log a correct, descriptive message (e.g., "S3 delete_object
failed for uplift-dev/user-profile/{user.net_id}-profile.png") including the
caught ClientError details and either re-raise the exception or log as a warning
depending on desired flow; locate the try/except around s3.delete_object
(references: s3.delete_object, ClientError, user.net_id) and change the print to
use the process logger or raise the error so orphaned files are not silently
ignored.
---
Nitpick comments:
In `@src/schema.py`:
- Line 11: The import WeeklyChallengeModel in src/schema.py is unused; remove
the line "from src.models.weekly_challenge import WeeklyChallenge as
WeeklyChallengeModel" from the file to satisfy the F401 lint and avoid dead
imports, and run tests/lint to ensure no remaining references to
WeeklyChallengeModel in src/schema.py.
- Line 38: The file contains a duplicate import of the base64 module (another
import already exists near the top of src/schema.py); remove the redundant
"import base64" statement shown in the diff so only the single original import
remains, ensuring no F811 duplicate-import lint error.
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
src/schema.py (1)
1169-1173: ⚠️ Potential issue | 🟠 Major

Add @jwt_required() decorator to DeleteUserById.mutate().

The mutation currently allows deleting any user by ID without authentication. This is inconsistent with other user-modifying mutations like SetWorkoutGoals and LogWorkout, which are protected with @jwt_required(). A public endpoint that can delete any user is a security risk.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` around lines 1169 - 1173, The DeleteUserById.mutate method is missing authentication; add the `@jwt_required`() decorator to the DeleteUserById.mutate method so the mutation requires a valid JWT, and ensure you import jwt_required (e.g., from flask_jwt_extended import jwt_required) at the top of the module; update the decorator directly above the mutate method in the DeleteUserById class and run tests to confirm other auth logic (current_user access, permission checks) still works.
♻️ Duplicate comments (2)
src/schema.py (2)
1181-1187: ⚠️ Potential issue | 🟡 Minor

Incorrect error message and silently swallowed failure.

The issues from the previous review remain:
- Line 1187 prints "Upload error" but this is a delete operation.
- The ClientError is swallowed, so the user is deleted from the database while their profile image may remain orphaned in cloud storage.

Consider either propagating the error or at minimum logging correctly:

Proposed fix:

```diff
 try:
     s3.delete_object(
         Bucket="appdev-upload",
         Key=f"uplift-dev/user-profile/{user.net_id}-profile.png",
     )
 except ClientError as e:
-    print("Upload error:", e)
+    # Log but continue - deleting the user is more important than cleaning up the image
+    print(f"Warning: Failed to delete profile image for user {user.net_id}: {e}")
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` around lines 1181 - 1187, The delete-block around s3.delete_object is swallowing ClientError and has the wrong log text; update the except handler for s3.delete_object (Bucket "appdev-upload", Key f"uplift-dev/user-profile/{user.net_id}-profile.png") to log an accurate message (e.g., "Delete profile image error") including the exception details and then either re-raise the ClientError or propagate it to the caller instead of silently continuing so the user record isn't deleted while the S3 object may remain; ensure you reference ClientError and user.net_id in the handler so the caller can handle or abort the DB delete upon failure.
849-867: ⚠️ Potential issue | 🔴 Critical

Critical: encoded_image can be None and final_photo_url may be undefined.

The issues from the previous review remain unaddressed:
- encoded_image is optional (required=False), but base64.b64decode(encoded_image) will raise TypeError if encoded_image is None.
- If a ClientError occurs, final_photo_url is never assigned, causing NameError on line 869.
- Lines 854 and 862 assign response and key but neither is used.

Proposed fix:

```diff
-# Decode the base64 image
-image_data = base64.b64decode(encoded_image)
-
-# Upload to Spaces
-try:
-    response = s3.put_object(
-        Bucket="appdev-upload",
-        Key=f"uplift-dev/user-profile/{net_id}-profile.png",
-        Body=image_data,
-        ContentType="image/png",
-        ACL="public-read"
-    )
-
-    key = "uplift-dev/photo.jpg"
-    final_photo_url = f"https://nyc3.digitaloceanspaces.com/uplift-dev/user-profile/{net_id}-profile.png"
-
-except ClientError as e:
-    print("Upload error:", e)
+final_photo_url = None
+if encoded_image:
+    # Decode the base64 image
+    image_data = base64.b64decode(encoded_image)
+
+    # Upload to Spaces
+    try:
+        s3.put_object(
+            Bucket="appdev-upload",
+            Key=f"uplift-dev/user-profile/{net_id}-profile.png",
+            Body=image_data,
+            ContentType="image/png",
+            ACL="public-read"
+        )
+        final_photo_url = f"https://appdev-upload.nyc3.digitaloceanspaces.com/uplift-dev/user-profile/{net_id}-profile.png"
+    except ClientError as e:
+        raise GraphQLError(f"Failed to upload profile image: {e}")
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` around lines 849 - 867, Handle the optional encoded_image and ensure final_photo_url is always defined: before calling base64.b64decode(encoded_image) in the block using encoded_image and s3.put_object, check if encoded_image is None and either skip the upload (set final_photo_url to a sensible default/None) or return/raise a clear error; initialize final_photo_url to a safe default before the try so a ClientError cannot leave it undefined; remove or use the unused locals response and key (either drop them or use response/Key in further logic) and catch ClientError from botocore.exceptions to log the exception (e) while keeping final_photo_url defined.
🧹 Nitpick comments (2)
src/schema.py (2)
11-11: Remove unused and duplicate imports.

Static analysis flagged:
- Line 11: WeeklyChallengeModel is imported but never used in this file.
- Line 38: base64 is imported again, duplicating line 2.

Proposed fix:

```diff
-from src.models.weekly_challenge import WeeklyChallenge as WeeklyChallengeModel
 import boto3
 from botocore.exceptions import ClientError
-import base64
```

Also applies to: 36-38
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` at line 11, Remove the unused import WeeklyChallengeModel and the duplicate base64 import: delete the line that imports WeeklyChallenge as WeeklyChallengeModel (it's never referenced) and remove the second/base64 re-import so only a single import of base64 remains; confirm only necessary imports exist at the top of the module and run linters/type-checks to verify no remaining unused imports.
843-847: Consider extracting S3 client initialization and constants.

The S3 client creation and bucket/path configurations are duplicated. Extracting these would reduce duplication and make configuration changes easier.

Suggested approach:

```python
# At module level or in a separate utils module
SPACES_BUCKET = "appdev-upload"
PROFILE_IMAGE_PREFIX = "uplift-dev/user-profile"

def get_s3_client():
    return boto3.client(
        's3',
        endpoint_url=os.getenv("DIGITAL_OCEAN_URL"),
        aws_access_key_id=os.getenv("DIGITAL_OCEAN_ACCESS"),
        aws_secret_access_key=os.getenv("DIGITAL_OCEAN_SECRET_ACCESS")
    )

def get_profile_image_key(net_id: str) -> str:
    return f"{PROFILE_IMAGE_PREFIX}/{net_id}-profile.png"

def get_profile_image_url(net_id: str) -> str:
    return f"https://{SPACES_BUCKET}.nyc3.digitaloceanspaces.com/{PROFILE_IMAGE_PREFIX}/{net_id}-profile.png"
```

Also applies to: 1175-1179
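Centralizing the key builder also addresses the orphaned-avatar concern raised elsewhere in this review: if upload, stored URL, and delete all derive from one function, they cannot drift apart. A minimal sketch of that property (the helper names follow the suggestion above and are assumptions, not code that exists in the PR):

```python
SPACES_BUCKET = "appdev-upload"
PROFILE_IMAGE_PREFIX = "uplift-dev/user-profile"

def get_profile_image_key(net_id):
    # Single source of truth for the object key used by put_object and delete_object
    return f"{PROFILE_IMAGE_PREFIX}/{net_id}-profile.png"

def get_profile_image_url(net_id):
    # The URL is derived from the same key, so it always points at the uploaded object
    return f"https://{SPACES_BUCKET}.nyc3.digitaloceanspaces.com/{get_profile_image_key(net_id)}"
```

Usage: CreateUser would pass get_profile_image_key(net_id) to put_object and store get_profile_image_url(net_id); DeleteUserById would pass the same key to delete_object.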
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` around lines 843 - 847, The S3 client creation (variable s3) and hard-coded bucket/path strings are duplicated; extract them into module-level constants (e.g., SPACES_BUCKET, PROFILE_IMAGE_PREFIX) and a factory function get_s3_client() that returns the boto3.client configured with DIGITAL_OCEAN_URL/ACCESS/SECRET, and add helper functions get_profile_image_key(net_id) and get_profile_image_url(net_id) to build keys/URLs; then replace direct instantiations of s3 and inlined path strings (the s3 = boto3.client(...) usages and the profile path constructions currently duplicated at the shown diff and around lines 1175-1179) with calls to get_s3_client(), get_profile_image_key(), and get_profile_image_url() so configuration is centralized and duplication removed.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/schema.py`:
- Line 863: The constructed final_photo_url is missing the bucket and CDN
domain; update how final_photo_url is built (the variable final_photo_url in
this file) to use the canonical DigitalOcean Spaces format used elsewhere
(ASSET_BASE_URL from constants.py) or explicitly include the bucket
"appdev-upload" and the CDN host (e.g.,
{bucket}.{region}.cdn.digitaloceanspaces.com) so the URL becomes
"https://appdev-upload.{region}.cdn.digitaloceanspaces.com/user-profile/{net_id}-profile.png"
(use the existing ASSET_BASE_URL constant if available to assemble the path with
net_id and the same user-profile path).
---
Outside diff comments:
In `@src/schema.py`:
- Around line 1169-1173: The DeleteUserById.mutate method is missing
authentication; add the `@jwt_required`() decorator to the DeleteUserById.mutate
method so the mutation requires a valid JWT, and ensure you import jwt_required
(e.g., from flask_jwt_extended import jwt_required) at the top of the module;
update the decorator directly above the mutate method in the DeleteUserById
class and run tests to confirm other auth logic (current_user access, permission
checks) still works.
---
Duplicate comments:
In `@src/schema.py`:
- Around line 1181-1187: The delete-block around s3.delete_object is swallowing
ClientError and has the wrong log text; update the except handler for
s3.delete_object (Bucket "appdev-upload", Key
f"uplift-dev/user-profile/{user.net_id}-profile.png") to log an accurate message
(e.g., "Delete profile image error") including the exception details and then
either re-raise the ClientError or propagate it to the caller instead of
silently continuing so the user record isn't deleted while the S3 object may
remain; ensure you reference ClientError and user.net_id in the handler so the
caller can handle or abort the DB delete upon failure.
- Around line 849-867: Handle the optional encoded_image and ensure
final_photo_url is always defined: before calling
base64.b64decode(encoded_image) in the block using encoded_image and
s3.put_object, check if encoded_image is None and either skip the upload (set
final_photo_url to a sensible default/None) or return/raise a clear error;
initialize final_photo_url to a safe default before the try so a ClientError
cannot leave it undefined; remove or use the unused locals response and key
(either drop them or use response/Key in further logic) and catch ClientError
from botocore.exceptions to log the exception (e) while keeping final_photo_url
defined.
---
Nitpick comments:
In `@src/schema.py`:
- Line 11: Remove the unused import WeeklyChallengeModel and the duplicate
base64 import: delete the line that imports WeeklyChallenge as
WeeklyChallengeModel (it's never referenced) and remove the second/base64
re-import so only a single import of base64 remains; confirm only necessary
imports exist at the top of the module and run linters/type-checks to verify no
remaining unused imports.
- Around line 843-847: The S3 client creation (variable s3) and hard-coded
bucket/path strings are duplicated; extract them into module-level constants
(e.g., SPACES_BUCKET, PROFILE_IMAGE_PREFIX) and a factory function
get_s3_client() that returns the boto3.client configured with
DIGITAL_OCEAN_URL/ACCESS/SECRET, and add helper functions
get_profile_image_key(net_id) and get_profile_image_url(net_id) to build
keys/URLs; then replace direct instantiations of s3 and inlined path strings
(the s3 = boto3.client(...) usages and the profile path constructions currently
duplicated at the shown diff and around lines 1175-1179) with calls to
get_s3_client(), get_profile_image_key(), and get_profile_image_url() so
configuration is centralized and duplication removed.
JoshD94 left a comment:
lgtm after we fix the 3 issues found by coderabbit
Actionable comments posted: 3
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/schema.py`:
- Line 11: Remove the unused import WeeklyChallengeModel and the duplicate
base64 import to fix the lint regressions: delete the line importing
WeeklyChallenge as WeeklyChallengeModel (since it isn't referenced anywhere) and
remove the redundant second import of base64 so only a single base64 import
remains; ensure there are no other references to WeeklyChallengeModel before
removing.
- Around line 1178-1194: Add an authorization/ownership check before calling
s3.delete_object so only the account owner or an authorized admin can delete the
profile image: fetch the current authenticated user (e.g., from context or auth
helper used elsewhere in schema.py), compare its identifier to the target "user"
(or check an is_admin flag) and raise GraphQLError("Not authorized") if the
caller is not the owner/admin; only proceed to call s3.delete_object for
user.net_id when the check passes (leave the existing ClientError handling
intact).
- Around line 856-868: The upload code currently hardcodes an object Key (path)
and builds final_photo_url, but EditUser still calls the legacy upload contract
(sending only bucket and image) and DeleteUserById re-derives the fixed key —
causing orphaned avatars; fix by unifying the storage contract: either (A) stop
hardcoding Key and instead call the same legacy upload service from this code
and use the returned URL (so the stored value matches what EditUser saves), or
(B) change EditUser to send the object Key and bucket (not the raw image) so it
stores the same path this code writes; update DeleteUserById to delete using the
stored URL (parse bucket and key from final_photo_url or from stored fields) and
ensure functions EditUser and DeleteUserById and the upload code that builds
final_photo_url/path all use the same bucket+key format.
Actionable comments posted: 4
♻️ Duplicate comments (1)
src/schema.py (1)
1176-1181: ⚠️ Potential issue | 🔴 Critical

DeleteUserById also needs owner/admin authorization before deletion.

@jwt_required() authenticates the caller but does not prevent deleting other users by ID.

🔐 Proposed fix:

```diff
 @jwt_required()
 def mutate(self, info, user_id):
+    caller_id = int(get_jwt_identity())
+    if caller_id != int(user_id):
+        raise GraphQLError("Not authorized.")
+
     # Check if user exists
     user = User.get_query(info).filter(UserModel.id == user_id).first()
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` around lines 1176 - 1181, The DeleteUserById mutate currently only has `@jwt_required`() which authenticates but doesn't authorize; update mutate to fetch the caller identity (e.g., via get_jwt_identity() or current_user) and the caller User record (use User.get_query/info), then allow deletion only if caller.id == user_id OR caller.UserModel.is_admin is true; otherwise raise GraphQLError("Not authorized to delete this user."). Keep the existing user lookup (User.get_query(...).filter(UserModel.id == user_id)) and perform the authorization check before performing the delete.
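The ownership rule these comments keep converging on can be expressed as a standalone predicate. A minimal sketch, assuming an integer caller id from the JWT and an optional admin flag (the helper name and flag are illustrative, not part of the PR):

```python
def may_delete_user(caller_id: int, target_user_id: int, caller_is_admin: bool = False) -> bool:
    """Return True only for the account owner or an admin.

    Illustrative helper: the mutation would raise
    GraphQLError("Not authorized.") when this returns False.
    """
    return caller_is_admin or caller_id == target_user_id
```

Keeping the predicate separate from the mutation makes the owner-or-admin decision trivially unit-testable.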
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/schema.py`:
- Around line 844-848: The boto3.client(...) call for s3 (and the two other
client initializations) has incorrect multi-line indentation causing E128/E124;
reformat each call so keyword arguments are either all on one line or use a
hanging indent: put the opening parenthesis after boto3.client and place each
keyword argument on its own line aligned with one another (e.g.,
boto3.client('s3',\n endpoint_url=...,\n aws_access_key_id=...,\n
aws_secret_access_key=...\n) ), ensuring the closing parenthesis aligns with the
start of the call; update the s3 assignment and the other two client
initializations (the other boto3.client(...) occurrences) accordingly.
- Around line 850-851: The code calls base64.b64decode(encoded_image) without
validation, causing unhandled exceptions for malformed input; wrap the decode in
a try/except that catches binascii.Error and ValueError around the
base64.b64decode(encoded_image) call (the assignment to image_data) in both
create and edit flows, log the decoding error, and return/raise a controlled
user-facing error (e.g., GraphQLError or a typed validation error) before any S3
upload function is called so malformed data never reaches upload_to_s3 or
similar upload logic.
- Around line 888-893: The mutate method in EditUserById currently only requires
a valid JWT but does not verify that the caller may edit the target user; update
mutate to fetch the current user identity from the JWT (e.g. get_jwt_identity()
or info.context current user), then enforce authorization by allowing the update
only if current_user.id == user_id or current_user has an admin role (check
existing_user.role or current_user.role), otherwise raise a GraphQLError("Not
authorized"); keep the existing existence check on existing_user and apply the
ownership/admin check before performing any updates.
- Around line 909-911: In EditUserById.mutate the variable net_id is undefined
when building the upload path; replace it with a defined identifier from the
resolved user or input (for example use the fetched user object's net_id:
user.net_id) and ensure user is present before using it; update the path
f"uplift-dev/user-profile/{net_id}-profile.png" to
f"uplift-dev/user-profile/{user.net_id}-profile.png" (or fall back to another
stable id/raise a clear error if net_id missing) and use that same resolved
value for the put_object call.
---
Duplicate comments:
In `@src/schema.py`:
- Around line 1176-1181: The DeleteUserById mutate currently only has
`@jwt_required`() which authenticates but doesn't authorize; update mutate to
fetch the caller identity (e.g., via get_jwt_identity() or current_user) and the
caller User record (use User.get_query/info), then allow deletion only if
caller.id == user_id OR caller.UserModel.is_admin is true; otherwise raise
GraphQLError("Not authorized to delete this user."). Keep the existing user
lookup (User.get_query(...).filter(UserModel.id == user_id)) and perform the
authorization check before performing the delete.
Actionable comments posted: 1
🧹 Nitpick comments (4)
src/schema.py (4)
1493-1493: Minor: Description says "new user" but should be "existing user".

The description text is slightly misleading since this mutation edits existing users, not new ones.

📝 Suggested fix

- edit_user = EditUserById.Field(description="Edit a new user by id.")
+ edit_user = EditUserById.Field(description="Edit an existing user by id.")

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` at line 1493, Update the description on the GraphQL field declaration for edit_user (edit_user = EditUserById.Field) to correctly refer to an existing user rather than a new one; replace the current description "Edit a new user by id." with "Edit an existing user by id." so EditUserById.Field has an accurate description.
854-875: Chain exceptions to preserve traceback context.

The raise GraphQLError(...) statements within except blocks should use exception chaining to preserve the original traceback for debugging.

🧹 Proposed fix for exception chaining

     try:
         image_data = base64.b64decode(encoded_image, validate=True)
     except (binascii.Error, ValueError) as err:
-        raise GraphQLError("Invalid profile image encoding.")
+        raise GraphQLError("Invalid profile image encoding.") from err

     try:
         bucket = "appdev-upload"
         ...
     except ClientError as e:
         print("Upload error:", e)
-        raise GraphQLError("Error uploading user profile picture.")
+        raise GraphQLError("Error uploading user profile picture.") from e

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` around lines 854 - 875, The except handlers for decoding and upload drop the original exceptions; update both exception raises to chain the original errors by re-raising GraphQLError with "from err" / "from e" respectively so the original traceback is preserved (affecting the try/except blocks around base64.b64decode(encoded_image, validate=True) and the S3 upload via s3.put_object where ClientError is caught); ensure the GraphQLError messages remain the same but use "raise GraphQLError(... ) from err" for the binascii/ValueError handler and "raise GraphQLError(... ) from e" for the ClientError handler.
1198-1214: Move S3 client initialization inside the conditional block.

The boto3 client is initialized even when user.encoded_image is falsy and no deletion is needed. Moving it inside the if block avoids unnecessary client setup.

♻️ Proposed refactor

-        s3 = boto3.client(
-            "s3",
-            endpoint_url=os.getenv("DIGITAL_OCEAN_URL"),
-            aws_access_key_id=os.getenv("DIGITAL_OCEAN_ACCESS"),
-            aws_secret_access_key=os.getenv("DIGITAL_OCEAN_SECRET_ACCESS")
-        )
-
         if user.encoded_image:
+            s3 = boto3.client(
+                "s3",
+                endpoint_url=os.getenv("DIGITAL_OCEAN_URL"),
+                aws_access_key_id=os.getenv("DIGITAL_OCEAN_ACCESS"),
+                aws_secret_access_key=os.getenv("DIGITAL_OCEAN_SECRET_ACCESS")
+            )
             try:
                 s3.delete_object(
                     Bucket="appdev-upload",
                     Key=f"uplift-dev/user-profile/{user.net_id}-profile.png",
                 )
             except ClientError as e:
                 print("Delete error:", e)
-                raise GraphQLError("Error deleting user profile picture")
+                raise GraphQLError("Error deleting user profile picture") from e

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` around lines 1198 - 1214, The S3 client is created unconditionally even when no deletion is needed; move the boto3.client(...) initialization inside the if user.encoded_image: block so it’s only constructed when you will call s3.delete_object. Specifically, remove the top-level s3 = boto3.client(...) and instead create the client just before calling delete_object in the user.encoded_image branch (keeping the same endpoint_url, aws_access_key_id, aws_secret_access_key), then call s3.delete_object(...) and keep the existing ClientError handling around that call.
847-852: Consider extracting S3 client initialization and URL helpers.

The boto3 client initialization and URL construction are duplicated across CreateUser, EditUserById, and DeleteUserById. Extracting these into a shared helper (e.g., in a utils/s3.py module) would improve maintainability and ensure consistent configuration.

# Example: src/utils/s3.py
import os

import boto3

BUCKET = "appdev-upload"
REGION = "nyc3"
USER_PROFILE_PREFIX = "uplift-dev/user-profile"


def get_s3_client():
    return boto3.client(
        "s3",
        endpoint_url=os.getenv("DIGITAL_OCEAN_URL"),
        aws_access_key_id=os.getenv("DIGITAL_OCEAN_ACCESS"),
        aws_secret_access_key=os.getenv("DIGITAL_OCEAN_SECRET_ACCESS")
    )


def get_profile_image_key(net_id: str) -> str:
    return f"{USER_PROFILE_PREFIX}/{net_id}-profile.png"


def get_profile_image_url(net_id: str) -> str:
    return f"https://{BUCKET}.{REGION}.digitaloceanspaces.com/{get_profile_image_key(net_id)}"

Also applies to: 907-912, 1198-1203
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` around lines 847 - 852, Extract the duplicated boto3 S3 client creation and URL/key construction used in CreateUser, EditUserById, and DeleteUserById into a shared helper module (e.g., src/utils/s3.py) by adding a get_s3_client() that returns the boto3.client configured with DIGITAL_OCEAN_* env vars and helper functions get_profile_image_key(net_id) and get_profile_image_url(net_id) (with constants BUCKET, REGION, USER_PROFILE_PREFIX); then update CreateUser, EditUserById, and DeleteUserById to call get_s3_client() and the key/url helpers instead of inlining the boto3.client and URL construction so configuration is centralized and consistent.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/schema.py`:
- Around line 1195-1196: The authorization check in the resolver uses
get_jwt_identity() (which returns a string) against user_id (declared as
graphene.Int), causing mismatched types and always failing; fix by normalizing
types before comparison — for example convert get_jwt_identity() to int or
convert user_id to str — and then compare (e.g., int(get_jwt_identity()) ==
user_id or str(user_id) == get_jwt_identity()); update the check around
get_jwt_identity(), user_id and preserve raising GraphQLError("Unauthorized
operation") on mismatch.
---
Nitpick comments:
In `@src/schema.py`:
- Line 1493: Update the description on the GraphQL field declaration for
edit_user (edit_user = EditUserById.Field) to correctly refer to an existing
user rather than a new one; replace the current description "Edit a new user by
id." with "Edit an existing user by id." so EditUserById.Field has an accurate
description.
- Around line 854-875: The except handlers for decoding and upload drop the
original exceptions; update both exception raises to chain the original errors
by re-raising GraphQLError with "from err" / "from e" respectively so the
original traceback is preserved (affecting the try/except blocks around
base64.b64decode(encoded_image, validate=True) and the S3 upload via
s3.put_object where ClientError is caught); ensure the GraphQLError messages
remain the same but use "raise GraphQLError(... ) from err" for the
binascii/ValueError handler and "raise GraphQLError(... ) from e" for the
ClientError handler.
- Around line 1198-1214: The S3 client is created unconditionally even when no
deletion is needed; move the boto3.client(...) initialization inside the if
user.encoded_image: block so it’s only constructed when you will call
s3.delete_object. Specifically, remove the top-level s3 = boto3.client(...) and
instead create the client just before calling delete_object in the
user.encoded_image branch (keeping the same endpoint_url, aws_access_key_id,
aws_secret_access_key), then call s3.delete_object(...) and keep the existing
ClientError handling around that call.
- Around line 847-852: Extract the duplicated boto3 S3 client creation and
URL/key construction used in CreateUser, EditUserById, and DeleteUserById into a
shared helper module (e.g., src/utils/s3.py) by adding a get_s3_client() that
returns the boto3.client configured with DIGITAL_OCEAN_* env vars and helper
functions get_profile_image_key(net_id) and get_profile_image_url(net_id) (with
constants BUCKET, REGION, USER_PROFILE_PREFIX); then update CreateUser,
EditUserById, and DeleteUserById to call get_s3_client() and the key/url helpers
instead of inlining the boto3.client and URL construction so configuration is
centralized and consistent.
Actionable comments posted: 1
🧹 Nitpick comments (6)
src/schema.py (6)
1493-1493: Minor: Description says "Edit a new user" but should be "Edit a user" or "Edit an existing user".

♻️ Proposed fix

- edit_user = EditUserById.Field(description="Edit a new user by id.")
+ edit_user = EditUserById.Field(description="Edit an existing user by ID.")

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` at line 1493, The description for the GraphQL field edit_user is inaccurate — update the description string passed to EditUserById.Field so it reads "Edit a user" or "Edit an existing user" instead of "Edit a new user by id." Locate the EditUserById.Field declaration (symbol: edit_user) and replace the description value accordingly to reflect editing an existing user.
1198-1205: Move S3 client initialization inside the conditional.

The boto3 client is instantiated unconditionally (lines 1198-1203), but it's only used if user.encoded_image is truthy. Moving the initialization inside the if block avoids unnecessary client creation and credential lookup.

♻️ Proposed fix

         if get_jwt_identity() != str(user_id):
             raise GraphQLError("Unauthorized operation")
-
-        s3 = boto3.client(
-            "s3",
-            endpoint_url=os.getenv("DIGITAL_OCEAN_URL"),
-            aws_access_key_id=os.getenv("DIGITAL_OCEAN_ACCESS"),
-            aws_secret_access_key=os.getenv("DIGITAL_OCEAN_SECRET_ACCESS")
-        )
         if user.encoded_image:
+            s3 = boto3.client(
+                "s3",
+                endpoint_url=os.getenv("DIGITAL_OCEAN_URL"),
+                aws_access_key_id=os.getenv("DIGITAL_OCEAN_ACCESS"),
+                aws_secret_access_key=os.getenv("DIGITAL_OCEAN_SECRET_ACCESS"),
+            )
             try:

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` around lines 1198 - 1205, The boto3 S3 client is created unconditionally but only used when user.encoded_image is truthy; move the boto3.client(...) call so it is initialized inside the if user.encoded_image: block. Locate the s3 = boto3.client(...) call and the subsequent if user.encoded_image check, remove the top-level instantiation and create s3 within that if branch (using the same endpoint_url, aws_access_key_id, aws_secret_access_key arguments) so credentials and client creation only occur when needed.
847-852: Extract S3 client creation to reduce duplication.

The boto3 client initialization is duplicated in CreateUser, EditUserById, and DeleteUserById with identical parameters. Consider extracting to a helper function or a lazily-initialized module-level client.

♻️ Proposed helper function

# Add near the top of the file or in a utils module
def get_s3_client():
    return boto3.client(
        "s3",
        endpoint_url=os.getenv("DIGITAL_OCEAN_URL"),
        aws_access_key_id=os.getenv("DIGITAL_OCEAN_ACCESS"),
        aws_secret_access_key=os.getenv("DIGITAL_OCEAN_SECRET_ACCESS"),
    )

Then replace inline client creation with s3 = get_s3_client().

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` around lines 847 - 852, The S3 client initialization is duplicated in CreateUser, EditUserById, and DeleteUserById; extract it into a single helper (e.g., get_s3_client) or a lazily-initialized module-level client and replace the inline boto3.client calls in those functions with s3 = get_s3_client(); ensure the helper uses the same endpoint_url, aws_access_key_id, and aws_secret_access_key environment variables so behavior remains identical.
1205-1213: Derive S3 key from stored URL instead of re-constructing it.

The code re-constructs the S3 key as uplift-dev/user-profile/{user.net_id}-profile.png, assuming the stored encoded_image URL follows this exact pattern. If the URL format changes, or if legacy users have images stored with different paths, deletion will target the wrong object, leaving orphaned files in storage.

Consider parsing the key from user.encoded_image directly, or storing the S3 key separately.

♻️ Proposed fix

         if user.encoded_image:
+            # Extract key from stored URL: https://{bucket}.{region}.digitaloceanspaces.com/{key}
+            try:
+                key = user.encoded_image.split(".digitaloceanspaces.com/", 1)[1]
+            except IndexError:
+                key = f"uplift-dev/user-profile/{user.net_id}-profile.png"  # fallback
+
             try:
                 s3.delete_object(
                     Bucket="appdev-upload",
-                    Key=f"uplift-dev/user-profile/{user.net_id}-profile.png",
+                    Key=key,
                 )

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` around lines 1205 - 1213, The deletion logic in the block that checks user.encoded_image currently reconstructs the S3 key (uplift-dev/user-profile/{user.net_id}-profile.png) which can be incorrect; update the code in the user image deletion path to derive the S3 object key from user.encoded_image (parse the stored URL to extract the bucket/key or read a stored s3_key field if available) and pass that key to s3.delete_object instead of the hardcoded path; preserve the existing ClientError handling and GraphQLError raising but include the parsed key in logs for clarity.
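The parsing step that fix describes can be sketched on its own; a minimal helper, assuming stored URLs are standard Spaces public URLs (the function name is illustrative, not part of the PR):

```python
from typing import Optional


def spaces_key_from_url(url: str) -> Optional[str]:
    """Extract the object key from a DigitalOcean Spaces public URL.

    Returns None when the Spaces host marker is absent, so callers can
    fall back to reconstructing the key from net_id.
    """
    marker = ".digitaloceanspaces.com/"
    _, found, key = url.partition(marker)
    return key if found else None
```

Deriving the key from the stored URL keeps deletion correct even if the upload path convention changes later.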
905-917: Consider handling empty string encoded_image explicitly.

If encoded_image is an empty string "", the code enters this block since "" is not None. The base64.b64decode("", validate=True) call succeeds and returns b"", resulting in an empty PNG upload to S3. This may be unintentional.

If clearing the profile image is a valid use case, consider deleting the existing S3 object and setting encoded_image to None. Otherwise, reject empty strings.

♻️ Proposed fix (reject empty strings)

-        if encoded_image is not None:
+        if encoded_image:
             final_photo_url = None
             s3 = boto3.client(

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` around lines 905 - 917, The code currently treats an empty string encoded_image as a valid value and decodes it to b"", causing an empty object upload; update the handler around encoded_image (before calling base64.b64decode) to explicitly reject empty strings by raising GraphQLError (same type used below) with a clear message (e.g., "Invalid profile image encoding.") when encoded_image == "" so only non-empty base64 content proceeds; reference the encoded_image check, the base64.b64decode call, and the existing GraphQLError for the change.
854-875: Chain exceptions and use or remove the err variable.

Static analysis flags two issues:

- Lines 856-857, 916-917: The err variable is captured but never used (F841).
- Lines 857, 875: Exceptions raised within except blocks should use raise ... from err or raise ... from None for proper exception chaining (B904).

♻️ Proposed fix

     try:
         image_data = base64.b64decode(encoded_image, validate=True)
-    except (binascii.Error, ValueError) as err:
-        raise GraphQLError("Invalid profile image encoding.")
+    except (binascii.Error, ValueError):
+        raise GraphQLError("Invalid profile image encoding.") from None

     try:
         bucket = "appdev-upload"
         ...
     except ClientError as e:
         print("Upload error:", e)
-        raise GraphQLError("Error uploading user profile picture.")
+        raise GraphQLError("Error uploading user profile picture.") from e

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` around lines 854 - 875, The except blocks capture variables that are never used and don't chain exceptions; change them to properly chain the original exceptions or drop the names: for the base64 call (base64.b64decode / encoded_image) replace "except (binascii.Error, ValueError) as err: raise GraphQLError(...)" with "except (binascii.Error, ValueError) as err: raise GraphQLError('Invalid profile image encoding.') from err" (or remove "as err" and use "from None" if you intentionally hide the cause), and for the upload block (except ClientError as e around s3.put_object) replace the current print + raise with "except ClientError as e: raise GraphQLError('Error uploading user profile picture.') from e" (and remove the unused print if not needed).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/schema.py`:
- Around line 884-900: EditUserById's user_id GraphQL argument is declared as
graphene.String but UserModel.id and DeleteUserById use integers; change
EditUserById.user_id to graphene.Int(required=True), update the mutate signature
to accept user_id as an int, and ensure the DB query uses the integer value
(existing_user = db_session.query(UserModel).filter(UserModel.id ==
user_id).first()). Also make the auth check consistent by comparing same types
(e.g., if int(get_jwt_identity()) != user_id: raise GraphQLError("Unauthorized
operation")) so both EditUserById.mutate and DeleteUserById use the same integer
identity semantics.
---
Nitpick comments:
In `@src/schema.py`:
- Line 1493: The description for the GraphQL field edit_user is inaccurate —
update the description string passed to EditUserById.Field so it reads "Edit a
user" or "Edit an existing user" instead of "Edit a new user by id." Locate the
EditUserById.Field declaration (symbol: edit_user) and replace the description
value accordingly to reflect editing an existing user.
- Around line 1198-1205: The boto3 S3 client is created unconditionally but only
used when user.encoded_image is truthy; move the boto3.client(...) call so it is
initialized inside the if user.encoded_image: block. Locate the s3 =
boto3.client(...) call and the subsequent if user.encoded_image check, remove
the top-level instantiation and create s3 within that if branch (using the same
endpoint_url, aws_access_key_id, aws_secret_access_key arguments) so credentials
and client creation only occur when needed.
- Around line 847-852: The S3 client initialization is duplicated in CreateUser,
EditUserById, and DeleteUserById; extract it into a single helper (e.g.,
get_s3_client) or a lazily-initialized module-level client and replace the
inline boto3.client calls in those functions with s3 = get_s3_client(); ensure
the helper uses the same endpoint_url, aws_access_key_id, and
aws_secret_access_key environment variables so behavior remains identical.
- Around line 1205-1213: The deletion logic in the block that checks
user.encoded_image currently reconstructs the S3 key
(uplift-dev/user-profile/{user.net_id}-profile.png) which can be incorrect;
update the code in the user image deletion path to derive the S3 object key from
user.encoded_image (parse the stored URL to extract the bucket/key or read a
stored s3_key field if available) and pass that key to s3.delete_object instead
of the hardcoded path; preserve the existing ClientError handling and
GraphQLError raising but include the parsed key in logs for clarity.
- Around line 905-917: The code currently treats an empty string encoded_image
as a valid value and decodes it to b"", causing an empty object upload; update
the handler around encoded_image (before calling base64.b64decode) to explicitly
reject empty strings by raising GraphQLError (same type used below) with a clear
message (e.g., "Invalid profile image encoding.") when encoded_image == "" so
only non-empty base64 content proceeds; reference the encoded_image check, the
base64.b64decode call, and the existing GraphQLError for the change.
- Around line 854-875: The except blocks capture variables that are never used
and don't chain exceptions; change them to properly chain the original
exceptions or drop the names: for the base64 call (base64.b64decode /
encoded_image) replace "except (binascii.Error, ValueError) as err: raise
GraphQLError(...)" with "except (binascii.Error, ValueError) as err: raise
GraphQLError('Invalid profile image encoding.') from err" (or remove "as err"
and use "from None" if you intentionally hide the cause), and for the upload
block (except ClientError as e around s3.put_object) replace the current print +
raise with "except ClientError as e: raise GraphQLError('Error uploading user
profile picture.') from e" (and remove the unused print if not needed).
Actionable comments posted: 2
♻️ Duplicate comments (1)
src/schema.py (1)
899-900: ⚠️ Potential issue | 🔴 Critical

Fix JWT identity type mismatch in authorization checks.

Line 899 and Line 1195 compare get_jwt_identity() to user_id directly. Here, identity is issued as a string (Line 790), while user_id is an Int, so valid users get rejected.

🐛 Suggested fix

-        if get_jwt_identity() != user_id:
-            raise GraphQLError("Unauthorized operation")
+        try:
+            caller_id = int(get_jwt_identity())
+        except (TypeError, ValueError) as err:
+            raise GraphQLError("Unauthorized operation") from err
+        if caller_id != user_id:
+            raise GraphQLError("Unauthorized operation")

#!/bin/bash
# Verify identity type issuance and direct string-vs-int comparisons.
rg -nP 'create_access_token\(identity=str\(user\.id\)\)|if get_jwt_identity\(\) != user_id' src/schema.py -C2

Also applies to: 1195-1196
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/schema.py` around lines 899 - 900, The authorization check compares get_jwt_identity() (issued as a string) to an Int user_id, causing valid users to be rejected; update the checks (references: get_jwt_identity and the user_id comparisons around where create_access_token is called with identity=str(user.id)) to coerce the JWT identity to an integer before comparing (e.g., parse int(get_jwt_identity()) and handle ValueError/TypeError by raising GraphQLError("Unauthorized operation")), and apply the same fix to both comparison sites so the types match consistently.
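The mismatch is easy to reproduce outside Flask, since JWT subjects are serialized as strings while user_id arrives as an Int. A minimal sketch of the coercion the fix proposes (a plain function, no flask_jwt_extended dependency):

```python
def identity_matches(jwt_identity, user_id: int) -> bool:
    """Compare a string JWT subject against an integer user id.

    Malformed or missing identities count as a mismatch instead of
    raising, mirroring the "Unauthorized operation" path.
    """
    try:
        return int(jwt_identity) == user_id
    except (TypeError, ValueError):
        return False
```

Coercing once at the boundary keeps both mutations comparing the same integer type.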
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/schema.py`:
- Around line 873-875: Three except ClientError as e handlers in src/schema.py
re-raise a GraphQLError without exception chaining; update each raise to
preserve the original exception (use "raise GraphQLError(...) from e") so the
ClientError context is retained. Specifically, modify the except ClientError as
e blocks that currently do print("Upload error:", e) / raise GraphQLError("Error
uploading user profile picture.") and the two other S3 failure handlers that
raise GraphQLError to include "from e". Ensure all three occurrences use "from
e" alongside the existing GraphQLError messages to satisfy exception chaining
and eliminate B904 warnings.
- Around line 856-857: In the except handlers that read "except (binascii.Error,
ValueError) as err: raise GraphQLError('Invalid profile image encoding.')"
update the raise to chain the original exception (use "raise GraphQLError(...)
from err") so the root cause is preserved; do this for the profile image
decoding handler and the other identical handler later in the file (both
currently binding the exception as err and raising GraphQLError).
---
Duplicate comments:
In `@src/schema.py`:
- Around line 899-900: The authorization check compares get_jwt_identity()
(issued as a string) to an Int user_id, causing valid users to be rejected;
update the checks (references: get_jwt_identity and the user_id comparisons
around where create_access_token is called with identity=str(user.id)) to coerce
the JWT identity to an integer before comparing (e.g., parse
int(get_jwt_identity()) and handle ValueError/TypeError by raising
GraphQLError("Unauthorized operation")), and apply the same fix to both
comparison sites so the types match consistently.
except (binascii.Error, ValueError) as err:
    raise GraphQLError("Invalid profile image encoding.")
🧩 Analysis chain
🏁 Script executed:
# First, let's look at the file structure and find src/schema.py
git ls-files src/schema.py
Repository: cuappdev/uplift-backend
Length of output: 80
🏁 Script executed:
# Now let's examine the specific lines mentioned in the review
sed -n '850,865p' src/schema.py
Repository: cuappdev/uplift-backend
Length of output: 706
🏁 Script executed:
# Also check lines around 916-917
sed -n '910,925p' src/schema.py
Repository: cuappdev/uplift-backend
Length of output: 719
🏁 Script executed:
# Let's check if there's any linting configuration (pylint, flake8, etc.)
fd -e "pylintrc|\.flake8|setup.cfg|pyproject.toml|tox.ini" -type f | head -20
Repository: cuappdev/uplift-backend
Length of output: 237
🏁 Script executed:
# Search for linting configuration files correctly
find . -maxdepth 2 -type f \( -name ".flake8" -o -name "pylintrc" -o -name "pyproject.toml" -o -name "setup.cfg" -o -name "tox.ini" \)
Repository: cuappdev/uplift-backend
Length of output: 93
🏁 Script executed:
# Check Python version requirement in setup.py or pyproject.toml
find . -maxdepth 2 -type f \( -name "setup.py" -o -name "pyproject.toml" -o -name "requirements.txt" \) | head -5
Repository: cuappdev/uplift-backend
Length of output: 102
🏁 Script executed:
# Search for similar exception patterns in the file to see if this is a systemic issue
rg "except.*as err:" src/schema.py -A 1
Repository: cuappdev/uplift-backend
Length of output: 322
🏁 Script executed:
# Also check broader exception handling patterns without the from clause
rg "raise GraphQLError" src/schema.py -B 2 | head -40
Repository: cuappdev/uplift-backend
Length of output: 1911
🏁 Script executed:
# Check flake8 configuration
cat .flake8
Repository: cuappdev/uplift-backend
Length of output: 178
🏁 Script executed:
# Check pyproject.toml for any Python version or linting settings
cat pyproject.toml
Repository: cuappdev/uplift-backend
Length of output: 182
🏁 Script executed:
# Let's verify the exact context around both occurrences
sed -n '854,860p' src/schema.py
Repository: cuappdev/uplift-backend
Length of output: 345
🏁 Script executed:
# And the second occurrence
sed -n '914,920p' src/schema.py
Repository: cuappdev/uplift-backend
Length of output: 344
Chain decode exceptions to fix lint and preserve root cause.
err is unused and the raised GraphQLError drops the original exception context. Add exception chaining with from err at both locations.
🛠️ Suggested fix
- except (binascii.Error, ValueError) as err:
- raise GraphQLError("Invalid profile image encoding.")
+ except (binascii.Error, ValueError) as err:
+ raise GraphQLError("Invalid profile image encoding.") from err

Also applies to: 916-917
🧰 Tools
🪛 Flake8 (7.3.0)
[error] 856-856: local variable 'err' is assigned to but never used
(F841)
🪛 Ruff (0.15.6)
[warning] 857-857: Within an except clause, raise exceptions with raise ... from err or raise ... from None to distinguish them from errors in exception handling
(B904)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/schema.py` around lines 856 - 857, In the except handlers that read
"except (binascii.Error, ValueError) as err: raise GraphQLError('Invalid profile
image encoding.')" update the raise to chain the original exception (use "raise
GraphQLError(...) from err") so the root cause is preserved; do this for the
profile image decoding handler and the other identical handler later in the file
(both currently binding the exception as err and raising GraphQLError).
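Both behaviors discussed here, strict validation and chaining, can be verified in isolation. A minimal sketch with a plain ValueError standing in for GraphQLError:

```python
import base64
import binascii


def decode_profile_image(encoded_image: str) -> bytes:
    """Strictly decode a base64 payload, chaining the low-level error.

    validate=True makes non-alphabet characters raise binascii.Error
    instead of being silently discarded; "from err" keeps that failure
    attached as __cause__ in the traceback.
    """
    try:
        return base64.b64decode(encoded_image, validate=True)
    except (binascii.Error, ValueError) as err:
        raise ValueError("Invalid profile image encoding.") from err
```

With chaining in place, the rendered traceback shows the binascii failure as the direct cause of the user-facing error.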
except ClientError as e:
    print("Upload error:", e)
    raise GraphQLError("Error uploading user profile picture.")
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# First, let's check if the file exists and its size
wc -l src/schema.py
Repository: cuappdev/uplift-backend
Length of output: 85
🏁 Script executed:
#!/bin/bash
# Examine the code at the specified lines with context
echo "=== Lines 873-875 context ==="
sed -n '860,880p' src/schema.py | cat -n
echo -e "\n=== Lines 934-936 context ==="
sed -n '920,945p' src/schema.py | cat -n
echo -e "\n=== Lines 1211-1213 context ==="
sed -n '1200,1225p' src/schema.py | cat -n

Repository: cuappdev/uplift-backend
Length of output: 3470
Add exception chaining to S3 failure handlers.
Three exception handlers re-raise without chaining the original ClientError, which loses the exception context and triggers B904 warnings. Add from e to each raise statement to preserve the exception chain for better debugging.
Locations and suggested fixes
Lines 873-875:
except ClientError as e:
print("Upload error:", e)
- raise GraphQLError("Error uploading user profile picture.")
+ raise GraphQLError("Error uploading user profile picture.") from e

Lines 934-936:
except ClientError as e:
print("Upload error:", e)
- raise GraphQLError("Error adding new user profile picture.")
+ raise GraphQLError("Error adding new user profile picture.") from e

Lines 1211-1213:
except ClientError as e:
print("Delete error:", e)
- raise GraphQLError("Error deleting user profile picture")
+ raise GraphQLError("Error deleting user profile picture") from e

🧰 Tools
🪛 Ruff (0.15.6)
[warning] 875-875: Within an except clause, raise exceptions with raise ... from err or raise ... from None to distinguish them from errors in exception handling
(B904)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/schema.py` around lines 873 - 875, Three except ClientError as e handlers
in src/schema.py re-raise a GraphQLError without exception chaining; update each
raise to preserve the original exception (use "raise GraphQLError(...) from e")
so the ClientError context is retained. Specifically, modify the except
ClientError as e blocks that currently do print("Upload error:", e) / raise
GraphQLError("Error uploading user profile picture.") and the two other S3
failure handlers that raise GraphQLError to include "from e". Ensure all three
occurrences use "from e" alongside the existing GraphQLError messages to satisfy
exception chaining and eliminate B904 warnings.
Overview
Our team had some issues with adding users (specifically user profile pictures) and deleting users on the frontend through GraphQL mutations. This PR addresses some of those issues.
The issues relating to adding users and uploading profile pictures were the first priority. I realized we were uploading profile pictures to DigitalOcean through a raw HTTP `request` when we should be using the `boto3` client and passing in access keys to perform WRITE and DELETE operations on the buckets.

The issues relating to deleting users stemmed from workouts linked to users not being deleted, and this was easily fixed by adding a foreign key and a cascade delete constraint on the foreign key relationship between workouts and users.
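The cascade behaviour described above can be sketched with SQLAlchemy; the model and column names here are illustrative, not the exact ones from this repo.

```python
from sqlalchemy import Column, ForeignKey, Integer, create_engine
from sqlalchemy.orm import Session, declarative_base, relationship

Base = declarative_base()


class User(Base):
    __tablename__ = "users"
    id = Column(Integer, primary_key=True)
    # ORM-level cascade: deleting a User also deletes its workouts
    workouts = relationship(
        "Workout", back_populates="user", cascade="all, delete-orphan"
    )


class Workout(Base):
    __tablename__ = "workouts"
    id = Column(Integer, primary_key=True)
    # ondelete="CASCADE" enforces the same rule at the database level
    user_id = Column(Integer, ForeignKey("users.id", ondelete="CASCADE"))
    user = relationship("User", back_populates="workouts")


engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    user = User(workouts=[Workout(), Workout()])
    session.add(user)
    session.commit()
    # Deleting the user cascades to the linked workouts
    session.delete(user)
    session.commit()
    assert session.query(Workout).count() == 0
```

Before the cascade was added, the dangling `workouts` rows are what made user deletion fail.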
Lastly, I realized the DeleteUser mutation was not deleting user profile pictures, so I added that functionality using the `boto3` client.

Test Coverage
Tested locally.
Summary by CodeRabbit
Updates
Bug Fixes